00:00:00.000 Started by upstream project "autotest-per-patch" build number 127082 00:00:00.000 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.030 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.030 The recommended git tool is: git 00:00:00.031 using credential 00000000-0000-0000-0000-000000000002 00:00:00.032 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.047 Fetching changes from the remote Git repository 00:00:00.057 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.070 Using shallow fetch with depth 1 00:00:00.070 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.070 > git --version # timeout=10 00:00:00.094 > git --version # 'git version 2.39.2' 00:00:00.094 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.124 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.124 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.618 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.629 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.640 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD) 00:00:02.640 > git config core.sparsecheckout # timeout=10 00:00:02.651 > git read-tree -mu HEAD # timeout=10 00:00:02.668 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5 00:00:02.697 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters" 00:00:02.697 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10 00:00:02.774 [Pipeline] Start of Pipeline 00:00:02.787 [Pipeline] library 00:00:02.788 Loading library shm_lib@master 00:00:02.789 Library shm_lib@master is cached. Copying from home. 00:00:02.805 [Pipeline] node 00:00:02.815 Running on WFP3 in /var/jenkins/workspace/crypto-phy-autotest 00:00:02.817 [Pipeline] { 00:00:02.827 [Pipeline] catchError 00:00:02.828 [Pipeline] { 00:00:02.843 [Pipeline] wrap 00:00:02.852 [Pipeline] { 00:00:02.860 [Pipeline] stage 00:00:02.862 [Pipeline] { (Prologue) 00:00:03.043 [Pipeline] sh 00:00:03.329 + logger -p user.info -t JENKINS-CI 00:00:03.343 [Pipeline] echo 00:00:03.344 Node: WFP3 00:00:03.349 [Pipeline] sh 00:00:03.640 [Pipeline] setCustomBuildProperty 00:00:03.649 [Pipeline] echo 00:00:03.650 Cleanup processes 00:00:03.654 [Pipeline] sh 00:00:03.936 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.936 1881897 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.949 [Pipeline] sh 00:00:04.228 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.228 ++ grep -v 'sudo pgrep' 00:00:04.228 ++ awk '{print $1}' 00:00:04.228 + sudo kill -9 00:00:04.228 + true 00:00:04.238 [Pipeline] cleanWs 00:00:04.245 [WS-CLEANUP] Deleting project workspace... 00:00:04.245 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.251 [WS-CLEANUP] done 00:00:04.254 [Pipeline] setCustomBuildProperty 00:00:04.265 [Pipeline] sh 00:00:04.541 + sudo git config --global --replace-all safe.directory '*' 00:00:04.613 [Pipeline] httpRequest 00:00:04.633 [Pipeline] echo 00:00:04.634 Sorcerer 10.211.164.101 is alive 00:00:04.642 [Pipeline] httpRequest 00:00:04.646 HttpMethod: GET 00:00:04.647 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:04.647 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:04.650 Response Code: HTTP/1.1 200 OK 00:00:04.650 Success: Status code 200 is in the accepted range: 200,404 00:00:04.651 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:05.647 [Pipeline] sh 00:00:05.925 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:05.939 [Pipeline] httpRequest 00:00:05.965 [Pipeline] echo 00:00:05.966 Sorcerer 10.211.164.101 is alive 00:00:05.972 [Pipeline] httpRequest 00:00:05.975 HttpMethod: GET 00:00:05.976 URL: http://10.211.164.101/packages/spdk_0bb5c21e286c2a526066ac6459b84bb9e7b10cac.tar.gz 00:00:05.976 Sending request to url: http://10.211.164.101/packages/spdk_0bb5c21e286c2a526066ac6459b84bb9e7b10cac.tar.gz 00:00:05.991 Response Code: HTTP/1.1 200 OK 00:00:05.992 Success: Status code 200 is in the accepted range: 200,404 00:00:05.992 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_0bb5c21e286c2a526066ac6459b84bb9e7b10cac.tar.gz 00:01:03.226 [Pipeline] sh 00:01:03.520 + tar --no-same-owner -xf spdk_0bb5c21e286c2a526066ac6459b84bb9e7b10cac.tar.gz 00:01:06.071 [Pipeline] sh 00:01:06.356 + git -C spdk log --oneline -n5 00:01:06.356 0bb5c21e2 nvmf: move register nvmf_poll_group_poll interrupt to nvmf 00:01:06.356 8968f30fe nvmf/tcp: replace pending_buf_queue with nvmf_tcp_request_get_buffers 00:01:06.356 13040d616 nvmf: enable iobuf based queuing for nvmf requests 00:01:06.356 5c0b15eed nvmf/tcp: fix spdk_nvmf_tcp_control_msg_list queuing 00:01:06.356 78cbcfdde test/scheduler: fix cpu mask for rpc governor tests 00:01:06.368 [Pipeline] } 00:01:06.384 [Pipeline] // stage 00:01:06.394 [Pipeline] stage 00:01:06.396 [Pipeline] { (Prepare) 00:01:06.416 [Pipeline] writeFile 00:01:06.434 [Pipeline] sh 00:01:06.719 + logger -p user.info -t JENKINS-CI 00:01:06.734 [Pipeline] sh 00:01:07.019 + logger -p user.info -t JENKINS-CI 00:01:07.031 [Pipeline] sh 00:01:07.314 + cat autorun-spdk.conf 00:01:07.314 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:07.314 SPDK_TEST_BLOCKDEV=1 00:01:07.314 SPDK_TEST_ISAL=1 00:01:07.314 SPDK_TEST_CRYPTO=1 00:01:07.314 SPDK_TEST_REDUCE=1 00:01:07.314 SPDK_TEST_VBDEV_COMPRESS=1 00:01:07.314 SPDK_RUN_UBSAN=1 00:01:07.314 SPDK_TEST_ACCEL=1 00:01:07.321 RUN_NIGHTLY=0 00:01:07.326 [Pipeline] readFile 00:01:07.354 [Pipeline] withEnv 00:01:07.356 [Pipeline] { 00:01:07.372 [Pipeline] sh 00:01:07.658 + set -ex 00:01:07.658 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:07.658 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:07.658 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:07.658 ++ SPDK_TEST_BLOCKDEV=1 00:01:07.658 ++ SPDK_TEST_ISAL=1 00:01:07.658 ++ SPDK_TEST_CRYPTO=1 00:01:07.658 ++ SPDK_TEST_REDUCE=1 00:01:07.658 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:07.658 ++ SPDK_RUN_UBSAN=1 00:01:07.658 ++ SPDK_TEST_ACCEL=1 00:01:07.658 ++ RUN_NIGHTLY=0 00:01:07.658 + case $SPDK_TEST_NVMF_NICS in 00:01:07.658 + DRIVERS= 
00:01:07.658 + [[ -n '' ]] 00:01:07.658 + exit 0 00:01:07.667 [Pipeline] } 00:01:07.683 [Pipeline] // withEnv 00:01:07.689 [Pipeline] } 00:01:07.707 [Pipeline] // stage 00:01:07.717 [Pipeline] catchError 00:01:07.719 [Pipeline] { 00:01:07.734 [Pipeline] timeout 00:01:07.735 Timeout set to expire in 1 hr 0 min 00:01:07.737 [Pipeline] { 00:01:07.756 [Pipeline] stage 00:01:07.758 [Pipeline] { (Tests) 00:01:07.774 [Pipeline] sh 00:01:08.061 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:08.061 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:08.061 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:08.061 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:08.061 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:08.061 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:08.061 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:08.061 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:08.061 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:08.061 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:08.061 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:08.061 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:08.061 + source /etc/os-release 00:01:08.061 ++ NAME='Fedora Linux' 00:01:08.061 ++ VERSION='38 (Cloud Edition)' 00:01:08.061 ++ ID=fedora 00:01:08.061 ++ VERSION_ID=38 00:01:08.061 ++ VERSION_CODENAME= 00:01:08.061 ++ PLATFORM_ID=platform:f38 00:01:08.061 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:08.061 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:08.061 ++ LOGO=fedora-logo-icon 00:01:08.061 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:08.061 ++ HOME_URL=https://fedoraproject.org/ 00:01:08.061 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:08.061 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:08.061 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:08.061 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:08.061 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:08.061 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:08.061 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:08.061 ++ SUPPORT_END=2024-05-14 00:01:08.061 ++ VARIANT='Cloud Edition' 00:01:08.061 ++ VARIANT_ID=cloud 00:01:08.061 + uname -a 00:01:08.061 Linux spdk-wfp-03 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux 00:01:08.061 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:10.595 Hugepages 00:01:10.596 node hugesize free / total 00:01:10.596 node0 1048576kB 0 / 0 00:01:10.596 node0 2048kB 0 / 0 00:01:10.596 node1 1048576kB 0 / 0 00:01:10.596 node1 2048kB 0 / 0 00:01:10.596 00:01:10.596 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:10.596 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:10.596 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:10.855 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:10.855 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:01:10.855 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2 00:01:10.855 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 
00:01:10.855 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:10.855 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:10.855 + rm -f /tmp/spdk-ld-path 00:01:10.855 + source autorun-spdk.conf 00:01:10.855 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:10.855 ++ SPDK_TEST_BLOCKDEV=1 00:01:10.855 ++ SPDK_TEST_ISAL=1 00:01:10.855 ++ SPDK_TEST_CRYPTO=1 00:01:10.855 ++ SPDK_TEST_REDUCE=1 00:01:10.855 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:10.855 ++ SPDK_RUN_UBSAN=1 00:01:10.855 ++ SPDK_TEST_ACCEL=1 00:01:10.855 ++ RUN_NIGHTLY=0 00:01:10.855 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:10.855 + [[ -n '' ]] 00:01:10.855 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:10.855 + for M in /var/spdk/build-*-manifest.txt 00:01:10.855 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:10.855 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:10.855 + for M in /var/spdk/build-*-manifest.txt 00:01:10.855 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:10.855 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:10.855 ++ uname 00:01:10.855 + [[ Linux == \L\i\n\u\x ]] 00:01:10.855 + sudo dmesg -T 00:01:11.115 + sudo dmesg --clear 00:01:11.115 + dmesg_pid=1882973 00:01:11.115 + [[ Fedora Linux == FreeBSD ]] 00:01:11.115 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.115 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.115 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:11.115 + [[ -x /usr/src/fio-static/fio ]] 00:01:11.115 + export FIO_BIN=/usr/src/fio-static/fio 00:01:11.115 + FIO_BIN=/usr/src/fio-static/fio 00:01:11.115 + sudo dmesg -Tw 00:01:11.115 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:11.115 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:11.115 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:11.115 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.115 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.115 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:11.115 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.115 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.115 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:11.115 Test configuration: 00:01:11.115 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.115 SPDK_TEST_BLOCKDEV=1 00:01:11.115 SPDK_TEST_ISAL=1 00:01:11.115 SPDK_TEST_CRYPTO=1 00:01:11.115 SPDK_TEST_REDUCE=1 00:01:11.115 SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.115 SPDK_RUN_UBSAN=1 00:01:11.115 SPDK_TEST_ACCEL=1 00:01:11.115 RUN_NIGHTLY=0 18:37:56 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:11.115 18:37:56 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:11.115 18:37:56 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:11.115 18:37:56 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:11.115 18:37:56 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.115 18:37:56 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.115 18:37:56 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.115 18:37:56 -- paths/export.sh@5 -- $ export PATH 00:01:11.115 18:37:56 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.115 18:37:56 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:11.115 18:37:56 -- common/autobuild_common.sh@447 -- $ date +%s 00:01:11.115 18:37:56 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721839076.XXXXXX 00:01:11.115 18:37:56 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721839076.Hiv9UF 00:01:11.115 18:37:56 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:01:11.115 18:37:56 -- 
common/autobuild_common.sh@453 -- $ '[' -n '' ']' 00:01:11.115 18:37:56 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:11.115 18:37:56 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:11.115 18:37:56 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:11.115 18:37:56 -- common/autobuild_common.sh@463 -- $ get_config_params 00:01:11.115 18:37:56 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:11.115 18:37:56 -- common/autotest_common.sh@10 -- $ set +x 00:01:11.115 18:37:56 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:11.115 18:37:56 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:01:11.115 18:37:56 -- pm/common@17 -- $ local monitor 00:01:11.115 18:37:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.115 18:37:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.115 18:37:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.115 18:37:56 -- pm/common@21 -- $ date +%s 00:01:11.115 18:37:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.115 18:37:56 -- pm/common@21 -- $ date +%s 00:01:11.115 18:37:56 -- pm/common@25 -- $ sleep 1 00:01:11.115 18:37:56 -- pm/common@21 -- $ date +%s 00:01:11.115 18:37:56 -- pm/common@21 -- $ date +%s 00:01:11.115 18:37:56 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721839076 00:01:11.115 18:37:56 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721839076 00:01:11.115 18:37:56 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721839076 00:01:11.115 18:37:56 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721839076 00:01:11.115 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721839076_collect-vmstat.pm.log 00:01:11.115 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721839076_collect-cpu-load.pm.log 00:01:11.115 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721839076_collect-cpu-temp.pm.log 00:01:11.375 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721839076_collect-bmc-pm.bmc.pm.log 00:01:12.313 18:37:57 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:01:12.313 
18:37:57 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:12.313 18:37:57 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:12.313 18:37:57 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:12.313 18:37:57 -- spdk/autobuild.sh@16 -- $ date -u 00:01:12.313 Wed Jul 24 04:37:57 PM UTC 2024 00:01:12.313 18:37:57 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:12.313 v24.09-pre-313-g0bb5c21e2 00:01:12.313 18:37:57 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:12.313 18:37:57 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:12.313 18:37:57 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:12.313 18:37:57 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:12.313 18:37:57 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:12.313 18:37:57 -- common/autotest_common.sh@10 -- $ set +x 00:01:12.313 ************************************ 00:01:12.313 START TEST ubsan 00:01:12.313 ************************************ 00:01:12.313 18:37:57 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:12.313 using ubsan 00:01:12.313 00:01:12.313 real 0m0.000s 00:01:12.313 user 0m0.000s 00:01:12.313 sys 0m0.000s 00:01:12.313 18:37:57 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:12.313 18:37:57 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:12.313 ************************************ 00:01:12.313 END TEST ubsan 00:01:12.313 ************************************ 00:01:12.313 18:37:57 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:12.313 18:37:57 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:12.313 18:37:57 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:12.313 18:37:57 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:12.313 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:12.313 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:12.573 Using 'verbs' RDMA provider 00:01:25.777 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:37.991 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:37.991 Creating mk/config.mk...done. 00:01:37.991 Creating mk/cc.flags.mk...done. 00:01:37.991 Type 'make' to build. 00:01:37.991 18:38:21 -- spdk/autobuild.sh@69 -- $ run_test make make -j96 00:01:37.991 18:38:21 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:37.991 18:38:21 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:37.991 18:38:21 -- common/autotest_common.sh@10 -- $ set +x 00:01:37.991 ************************************ 00:01:37.991 START TEST make 00:01:37.991 ************************************ 00:01:37.991 18:38:21 make -- common/autotest_common.sh@1123 -- $ make -j96 00:01:37.991 make[1]: Nothing to be done for 'all'. 
00:02:04.553 The Meson build system 00:02:04.554 Version: 1.3.1 00:02:04.554 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:04.554 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:04.554 Build type: native build 00:02:04.554 Program cat found: YES (/usr/bin/cat) 00:02:04.554 Project name: DPDK 00:02:04.554 Project version: 24.03.0 00:02:04.554 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:04.554 C linker for the host machine: cc ld.bfd 2.39-16 00:02:04.554 Host machine cpu family: x86_64 00:02:04.554 Host machine cpu: x86_64 00:02:04.554 Message: ## Building in Developer Mode ## 00:02:04.554 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:04.554 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:04.554 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:04.554 Program python3 found: YES (/usr/bin/python3) 00:02:04.554 Program cat found: YES (/usr/bin/cat) 00:02:04.554 Compiler for C supports arguments -march=native: YES 00:02:04.554 Checking for size of "void *" : 8 00:02:04.554 Checking for size of "void *" : 8 (cached) 00:02:04.554 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:04.554 Library m found: YES 00:02:04.554 Library numa found: YES 00:02:04.554 Has header "numaif.h" : YES 00:02:04.554 Library fdt found: NO 00:02:04.554 Library execinfo found: NO 00:02:04.554 Has header "execinfo.h" : YES 00:02:04.554 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:04.554 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:04.554 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:04.554 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:04.554 Run-time dependency openssl found: YES 3.0.9 00:02:04.554 Run-time dependency libpcap found: YES 1.10.4 00:02:04.554 Has header "pcap.h" with dependency libpcap: YES 00:02:04.554 Compiler for C supports arguments -Wcast-qual: YES 00:02:04.554 Compiler for C supports arguments -Wdeprecated: YES 00:02:04.554 Compiler for C supports arguments -Wformat: YES 00:02:04.554 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:04.554 Compiler for C supports arguments -Wformat-security: NO 00:02:04.554 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:04.554 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:04.554 Compiler for C supports arguments -Wnested-externs: YES 00:02:04.554 Compiler for C supports arguments -Wold-style-definition: YES 00:02:04.554 Compiler for C supports arguments -Wpointer-arith: YES 00:02:04.554 Compiler for C supports arguments -Wsign-compare: YES 00:02:04.554 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:04.554 Compiler for C supports arguments -Wundef: YES 00:02:04.554 Compiler for C supports arguments -Wwrite-strings: YES 00:02:04.554 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:04.554 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:04.554 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:04.554 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:04.554 Program objdump found: YES (/usr/bin/objdump) 00:02:04.554 Compiler for C supports arguments -mavx512f: YES 00:02:04.554 Checking if "AVX512 checking" compiles: YES 00:02:04.554 
Fetching value of define "__SSE4_2__" : 1 00:02:04.554 Fetching value of define "__AES__" : 1 00:02:04.554 Fetching value of define "__AVX__" : 1 00:02:04.554 Fetching value of define "__AVX2__" : 1 00:02:04.554 Fetching value of define "__AVX512BW__" : 1 00:02:04.554 Fetching value of define "__AVX512CD__" : 1 00:02:04.554 Fetching value of define "__AVX512DQ__" : 1 00:02:04.554 Fetching value of define "__AVX512F__" : 1 00:02:04.554 Fetching value of define "__AVX512VL__" : 1 00:02:04.554 Fetching value of define "__PCLMUL__" : 1 00:02:04.554 Fetching value of define "__RDRND__" : 1 00:02:04.554 Fetching value of define "__RDSEED__" : 1 00:02:04.554 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:04.554 Fetching value of define "__znver1__" : (undefined) 00:02:04.554 Fetching value of define "__znver2__" : (undefined) 00:02:04.554 Fetching value of define "__znver3__" : (undefined) 00:02:04.554 Fetching value of define "__znver4__" : (undefined) 00:02:04.554 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:04.554 Message: lib/log: Defining dependency "log" 00:02:04.554 Message: lib/kvargs: Defining dependency "kvargs" 00:02:04.554 Message: lib/telemetry: Defining dependency "telemetry" 00:02:04.554 Checking for function "getentropy" : NO 00:02:04.554 Message: lib/eal: Defining dependency "eal" 00:02:04.554 Message: lib/ring: Defining dependency "ring" 00:02:04.554 Message: lib/rcu: Defining dependency "rcu" 00:02:04.554 Message: lib/mempool: Defining dependency "mempool" 00:02:04.554 Message: lib/mbuf: Defining dependency "mbuf" 00:02:04.554 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:04.554 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:04.554 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:04.554 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:04.554 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:04.554 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:04.554 Compiler for C supports arguments -mpclmul: YES 00:02:04.554 Compiler for C supports arguments -maes: YES 00:02:04.554 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:04.554 Compiler for C supports arguments -mavx512bw: YES 00:02:04.554 Compiler for C supports arguments -mavx512dq: YES 00:02:04.554 Compiler for C supports arguments -mavx512vl: YES 00:02:04.554 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:04.554 Compiler for C supports arguments -mavx2: YES 00:02:04.554 Compiler for C supports arguments -mavx: YES 00:02:04.554 Message: lib/net: Defining dependency "net" 00:02:04.554 Message: lib/meter: Defining dependency "meter" 00:02:04.554 Message: lib/ethdev: Defining dependency "ethdev" 00:02:04.554 Message: lib/pci: Defining dependency "pci" 00:02:04.554 Message: lib/cmdline: Defining dependency "cmdline" 00:02:04.554 Message: lib/hash: Defining dependency "hash" 00:02:04.554 Message: lib/timer: Defining dependency "timer" 00:02:04.554 Message: lib/compressdev: Defining dependency "compressdev" 00:02:04.554 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:04.554 Message: lib/dmadev: Defining dependency "dmadev" 00:02:04.554 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:04.554 Message: lib/power: Defining dependency "power" 00:02:04.554 Message: lib/reorder: Defining dependency "reorder" 00:02:04.554 Message: lib/security: Defining dependency "security" 00:02:04.554 Has header "linux/userfaultfd.h" : YES 00:02:04.554 Has header "linux/vduse.h" : YES 00:02:04.554 
Message: lib/vhost: Defining dependency "vhost" 00:02:04.554 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:04.554 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:04.554 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:04.554 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:04.554 Compiler for C supports arguments -std=c11: YES 00:02:04.554 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:04.554 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:04.554 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:04.554 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:04.554 Run-time dependency libmlx5 found: YES 1.24.46.0 00:02:04.554 Run-time dependency libibverbs found: YES 1.14.46.0 00:02:04.554 Library mtcr_ul found: NO 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:04.554 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has 
symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:08.752 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:08.752 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:08.752 Configuring mlx5_autoconf.h using configuration 00:02:08.752 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:08.752 Run-time dependency libcrypto found: YES 3.0.9 00:02:08.752 Library IPSec_MB found: YES 00:02:08.752 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:08.752 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:08.752 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:08.752 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:08.752 Library IPSec_MB found: YES 00:02:08.752 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:08.752 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:08.752 Compiler for C supports arguments -std=c11: YES (cached) 00:02:08.752 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:08.752 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:08.752 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:08.752 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:08.752 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:08.752 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:08.752 Library libisal found: NO 00:02:08.752 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:08.752 Compiler for C supports arguments -std=c11: YES (cached) 00:02:08.752 Compiler for C supports arguments -Wno-strict-prototypes: YES 
(cached) 00:02:08.752 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:08.752 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:08.752 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:08.752 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:08.752 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:08.752 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:08.752 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:08.752 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:08.752 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:08.752 Program doxygen found: YES (/usr/bin/doxygen) 00:02:08.752 Configuring doxy-api-html.conf using configuration 00:02:08.753 Configuring doxy-api-man.conf using configuration 00:02:08.753 Program mandb found: YES (/usr/bin/mandb) 00:02:08.753 Program sphinx-build found: NO 00:02:08.753 Configuring rte_build_config.h using configuration 00:02:08.753 Message: 00:02:08.753 ================= 00:02:08.753 Applications Enabled 00:02:08.753 ================= 00:02:08.753 00:02:08.753 apps: 00:02:08.753 00:02:08.753 00:02:08.753 Message: 00:02:08.753 ================= 00:02:08.753 Libraries Enabled 00:02:08.753 ================= 00:02:08.753 00:02:08.753 libs: 00:02:08.753 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:08.753 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:08.753 cryptodev, dmadev, power, reorder, security, vhost, 00:02:08.753 00:02:08.753 Message: 00:02:08.753 =============== 00:02:08.753 Drivers Enabled 00:02:08.753 =============== 00:02:08.753 00:02:08.753 common: 00:02:08.753 mlx5, qat, 00:02:08.753 bus: 00:02:08.753 auxiliary, pci, vdev, 00:02:08.753 mempool: 00:02:08.753 ring, 00:02:08.753 dma: 00:02:08.753 00:02:08.753 net: 00:02:08.753 00:02:08.753 crypto: 00:02:08.753 ipsec_mb, mlx5, 00:02:08.753 compress: 00:02:08.753 isal, mlx5, 00:02:08.753 vdpa: 00:02:08.753 00:02:08.753 00:02:08.753 Message: 00:02:08.753 ================= 00:02:08.753 Content Skipped 00:02:08.753 ================= 00:02:08.753 00:02:08.753 apps: 00:02:08.753 dumpcap: explicitly disabled via build config 00:02:08.753 graph: explicitly disabled via build config 00:02:08.753 pdump: explicitly disabled via build config 00:02:08.753 proc-info: explicitly disabled via build config 00:02:08.753 test-acl: explicitly disabled via build config 00:02:08.753 test-bbdev: explicitly disabled via build config 00:02:08.753 test-cmdline: explicitly disabled via build config 00:02:08.753 test-compress-perf: explicitly disabled via build config 00:02:08.753 test-crypto-perf: explicitly disabled via build config 00:02:08.753 test-dma-perf: explicitly disabled via build config 00:02:08.753 test-eventdev: explicitly disabled via build config 00:02:08.753 test-fib: explicitly disabled via build config 00:02:08.753 test-flow-perf: explicitly disabled via build config 00:02:08.753 test-gpudev: explicitly disabled via build config 00:02:08.753 test-mldev: explicitly disabled via build config 00:02:08.753 test-pipeline: explicitly disabled via build config 00:02:08.753 test-pmd: explicitly disabled via build config 00:02:08.753 test-regex: explicitly disabled via build config 00:02:08.753 test-sad: explicitly disabled via build config 00:02:08.753 test-security-perf: explicitly disabled via build config 00:02:08.753 00:02:08.753 libs: 
00:02:08.753 argparse: explicitly disabled via build config 00:02:08.753 metrics: explicitly disabled via build config 00:02:08.753 acl: explicitly disabled via build config 00:02:08.753 bbdev: explicitly disabled via build config 00:02:08.753 bitratestats: explicitly disabled via build config 00:02:08.753 bpf: explicitly disabled via build config 00:02:08.753 cfgfile: explicitly disabled via build config 00:02:08.753 distributor: explicitly disabled via build config 00:02:08.753 efd: explicitly disabled via build config 00:02:08.753 eventdev: explicitly disabled via build config 00:02:08.753 dispatcher: explicitly disabled via build config 00:02:08.753 gpudev: explicitly disabled via build config 00:02:08.753 gro: explicitly disabled via build config 00:02:08.753 gso: explicitly disabled via build config 00:02:08.753 ip_frag: explicitly disabled via build config 00:02:08.753 jobstats: explicitly disabled via build config 00:02:08.753 latencystats: explicitly disabled via build config 00:02:08.753 lpm: explicitly disabled via build config 00:02:08.753 member: explicitly disabled via build config 00:02:08.753 pcapng: explicitly disabled via build config 00:02:08.753 rawdev: explicitly disabled via build config 00:02:08.753 regexdev: explicitly disabled via build config 00:02:08.753 mldev: explicitly disabled via build config 00:02:08.753 rib: explicitly disabled via build config 00:02:08.753 sched: explicitly disabled via build config 00:02:08.753 stack: explicitly disabled via build config 00:02:08.753 ipsec: explicitly disabled via build config 00:02:08.753 pdcp: explicitly disabled via build config 00:02:08.753 fib: explicitly disabled via build config 00:02:08.753 port: explicitly disabled via build config 00:02:08.753 pdump: explicitly disabled via build config 00:02:08.753 table: explicitly disabled via build config 00:02:08.753 pipeline: explicitly disabled via build config 00:02:08.753 graph: explicitly disabled via build config 00:02:08.753 node: explicitly disabled via build config 00:02:08.753 00:02:08.753 drivers: 00:02:08.753 common/cpt: not in enabled drivers build config 00:02:08.753 common/dpaax: not in enabled drivers build config 00:02:08.753 common/iavf: not in enabled drivers build config 00:02:08.753 common/idpf: not in enabled drivers build config 00:02:08.753 common/ionic: not in enabled drivers build config 00:02:08.753 common/mvep: not in enabled drivers build config 00:02:08.753 common/octeontx: not in enabled drivers build config 00:02:08.753 bus/cdx: not in enabled drivers build config 00:02:08.753 bus/dpaa: not in enabled drivers build config 00:02:08.753 bus/fslmc: not in enabled drivers build config 00:02:08.753 bus/ifpga: not in enabled drivers build config 00:02:08.753 bus/platform: not in enabled drivers build config 00:02:08.753 bus/uacce: not in enabled drivers build config 00:02:08.753 bus/vmbus: not in enabled drivers build config 00:02:08.753 common/cnxk: not in enabled drivers build config 00:02:08.753 common/nfp: not in enabled drivers build config 00:02:08.753 common/nitrox: not in enabled drivers build config 00:02:08.753 common/sfc_efx: not in enabled drivers build config 00:02:08.753 mempool/bucket: not in enabled drivers build config 00:02:08.753 mempool/cnxk: not in enabled drivers build config 00:02:08.753 mempool/dpaa: not in enabled drivers build config 00:02:08.753 mempool/dpaa2: not in enabled drivers build config 00:02:08.753 mempool/octeontx: not in enabled drivers build config 00:02:08.753 mempool/stack: not in enabled drivers build 
config 00:02:08.753 dma/cnxk: not in enabled drivers build config 00:02:08.753 dma/dpaa: not in enabled drivers build config 00:02:08.753 dma/dpaa2: not in enabled drivers build config 00:02:08.753 dma/hisilicon: not in enabled drivers build config 00:02:08.753 dma/idxd: not in enabled drivers build config 00:02:08.753 dma/ioat: not in enabled drivers build config 00:02:08.753 dma/skeleton: not in enabled drivers build config 00:02:08.753 net/af_packet: not in enabled drivers build config 00:02:08.753 net/af_xdp: not in enabled drivers build config 00:02:08.753 net/ark: not in enabled drivers build config 00:02:08.753 net/atlantic: not in enabled drivers build config 00:02:08.753 net/avp: not in enabled drivers build config 00:02:08.753 net/axgbe: not in enabled drivers build config 00:02:08.753 net/bnx2x: not in enabled drivers build config 00:02:08.753 net/bnxt: not in enabled drivers build config 00:02:08.753 net/bonding: not in enabled drivers build config 00:02:08.753 net/cnxk: not in enabled drivers build config 00:02:08.753 net/cpfl: not in enabled drivers build config 00:02:08.753 net/cxgbe: not in enabled drivers build config 00:02:08.753 net/dpaa: not in enabled drivers build config 00:02:08.753 net/dpaa2: not in enabled drivers build config 00:02:08.753 net/e1000: not in enabled drivers build config 00:02:08.753 net/ena: not in enabled drivers build config 00:02:08.753 net/enetc: not in enabled drivers build config 00:02:08.753 net/enetfec: not in enabled drivers build config 00:02:08.753 net/enic: not in enabled drivers build config 00:02:08.753 net/failsafe: not in enabled drivers build config 00:02:08.753 net/fm10k: not in enabled drivers build config 00:02:08.753 net/gve: not in enabled drivers build config 00:02:08.753 net/hinic: not in enabled drivers build config 00:02:08.753 net/hns3: not in enabled drivers build config 00:02:08.753 net/i40e: not in enabled drivers build config 00:02:08.753 net/iavf: not in enabled drivers build config 00:02:08.753 net/ice: not in enabled drivers build config 00:02:08.753 net/idpf: not in enabled drivers build config 00:02:08.753 net/igc: not in enabled drivers build config 00:02:08.753 net/ionic: not in enabled drivers build config 00:02:08.753 net/ipn3ke: not in enabled drivers build config 00:02:08.753 net/ixgbe: not in enabled drivers build config 00:02:08.753 net/mana: not in enabled drivers build config 00:02:08.753 net/memif: not in enabled drivers build config 00:02:08.753 net/mlx4: not in enabled drivers build config 00:02:08.753 net/mlx5: not in enabled drivers build config 00:02:08.753 net/mvneta: not in enabled drivers build config 00:02:08.753 net/mvpp2: not in enabled drivers build config 00:02:08.753 net/netvsc: not in enabled drivers build config 00:02:08.753 net/nfb: not in enabled drivers build config 00:02:08.753 net/nfp: not in enabled drivers build config 00:02:08.753 net/ngbe: not in enabled drivers build config 00:02:08.753 net/null: not in enabled drivers build config 00:02:08.753 net/octeontx: not in enabled drivers build config 00:02:08.753 net/octeon_ep: not in enabled drivers build config 00:02:08.753 net/pcap: not in enabled drivers build config 00:02:08.753 net/pfe: not in enabled drivers build config 00:02:08.753 net/qede: not in enabled drivers build config 00:02:08.753 net/ring: not in enabled drivers build config 00:02:08.753 net/sfc: not in enabled drivers build config 00:02:08.753 net/softnic: not in enabled drivers build config 00:02:08.753 net/tap: not in enabled drivers build config 00:02:08.753 
net/thunderx: not in enabled drivers build config 00:02:08.753 net/txgbe: not in enabled drivers build config 00:02:08.753 net/vdev_netvsc: not in enabled drivers build config 00:02:08.753 net/vhost: not in enabled drivers build config 00:02:08.753 net/virtio: not in enabled drivers build config 00:02:08.754 net/vmxnet3: not in enabled drivers build config 00:02:08.754 raw/*: missing internal dependency, "rawdev" 00:02:08.754 crypto/armv8: not in enabled drivers build config 00:02:08.754 crypto/bcmfs: not in enabled drivers build config 00:02:08.754 crypto/caam_jr: not in enabled drivers build config 00:02:08.754 crypto/ccp: not in enabled drivers build config 00:02:08.754 crypto/cnxk: not in enabled drivers build config 00:02:08.754 crypto/dpaa_sec: not in enabled drivers build config 00:02:08.754 crypto/dpaa2_sec: not in enabled drivers build config 00:02:08.754 crypto/mvsam: not in enabled drivers build config 00:02:08.754 crypto/nitrox: not in enabled drivers build config 00:02:08.754 crypto/null: not in enabled drivers build config 00:02:08.754 crypto/octeontx: not in enabled drivers build config 00:02:08.754 crypto/openssl: not in enabled drivers build config 00:02:08.754 crypto/scheduler: not in enabled drivers build config 00:02:08.754 crypto/uadk: not in enabled drivers build config 00:02:08.754 crypto/virtio: not in enabled drivers build config 00:02:08.754 compress/nitrox: not in enabled drivers build config 00:02:08.754 compress/octeontx: not in enabled drivers build config 00:02:08.754 compress/zlib: not in enabled drivers build config 00:02:08.754 regex/*: missing internal dependency, "regexdev" 00:02:08.754 ml/*: missing internal dependency, "mldev" 00:02:08.754 vdpa/ifc: not in enabled drivers build config 00:02:08.754 vdpa/mlx5: not in enabled drivers build config 00:02:08.754 vdpa/nfp: not in enabled drivers build config 00:02:08.754 vdpa/sfc: not in enabled drivers build config 00:02:08.754 event/*: missing internal dependency, "eventdev" 00:02:08.754 baseband/*: missing internal dependency, "bbdev" 00:02:08.754 gpu/*: missing internal dependency, "gpudev" 00:02:08.754 00:02:08.754 00:02:08.754 Build targets in project: 115 00:02:08.754 00:02:08.754 DPDK 24.03.0 00:02:08.754 00:02:08.754 User defined options 00:02:08.754 buildtype : debug 00:02:08.754 default_library : shared 00:02:08.754 libdir : lib 00:02:08.754 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:08.754 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:08.754 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:08.754 cpu_instruction_set: native 00:02:08.754 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:08.754 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:08.754 enable_docs : false 00:02:08.754 
enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:08.754 enable_kmods : false 00:02:08.754 max_lcores : 128 00:02:08.754 tests : false 00:02:08.754 00:02:08.754 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:08.754 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:08.754 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:08.754 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:08.754 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:08.754 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:08.754 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:09.013 [6/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:09.013 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:09.013 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:09.013 [9/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:09.013 [10/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:09.013 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:09.013 [12/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:09.013 [13/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:09.013 [14/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:09.013 [15/378] Linking static target lib/librte_kvargs.a 00:02:09.013 [16/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:09.013 [17/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:09.013 [18/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:09.013 [19/378] Linking static target lib/librte_log.a 00:02:09.013 [20/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:09.013 [21/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:09.013 [22/378] Linking static target lib/librte_pci.a 00:02:09.013 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:09.013 [24/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:09.283 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:09.283 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:09.283 [27/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:09.283 [28/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.283 [29/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:09.283 [30/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:09.283 [31/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:09.283 [32/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:09.283 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:09.541 [34/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.541 [35/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 
00:02:09.541 [36/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:09.541 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:09.541 [38/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:09.541 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:09.541 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:09.541 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:09.541 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:09.541 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:09.541 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:09.541 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:09.541 [46/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:09.541 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:09.541 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:09.542 [49/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:09.542 [50/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:09.542 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:09.542 [52/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:09.542 [53/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:09.542 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:09.542 [55/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:09.542 [56/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:09.542 [57/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:09.542 [58/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:09.542 [59/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:09.542 [60/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:09.542 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:09.542 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:09.542 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:09.542 [64/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:09.542 [65/378] Linking static target lib/librte_meter.a 00:02:09.542 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:09.542 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:09.542 [68/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:09.542 [69/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:09.542 [70/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:09.542 [71/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:09.542 [72/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:09.542 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:09.542 [74/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:09.542 [75/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:09.542 [76/378] 
Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:09.542 [77/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:09.542 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:09.542 [79/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:09.542 [80/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:09.542 [81/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:09.542 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:09.542 [83/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:09.542 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:09.542 [85/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:09.542 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:09.542 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:09.542 [88/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:09.542 [89/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:09.542 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:09.542 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:09.542 [92/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:09.542 [93/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:09.542 [94/378] Linking static target lib/librte_telemetry.a 00:02:09.542 [95/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:09.542 [96/378] Linking static target lib/librte_ring.a 00:02:09.542 [97/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:09.542 [98/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:09.542 [99/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:09.542 [100/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:09.542 [101/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:09.542 [102/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:09.542 [103/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:09.542 [104/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:09.542 [105/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:09.542 [106/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:09.542 [107/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:09.542 [108/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:09.542 [109/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:09.542 [110/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:09.542 [111/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:09.542 [112/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:09.542 [113/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:09.804 [114/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:09.804 [115/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:09.804 [116/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:09.804 [117/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:09.804 [118/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:09.804 [119/378] Linking static target lib/librte_net.a 00:02:09.804 [120/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:09.804 [121/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:09.804 [122/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:09.804 [123/378] Linking static target lib/librte_mempool.a 00:02:09.804 [124/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:09.804 [125/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:09.804 [126/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:09.804 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:09.804 [128/378] Linking static target lib/librte_rcu.a 00:02:09.804 [129/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:09.804 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:09.804 [131/378] Linking static target lib/librte_cmdline.a 00:02:09.804 [132/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:09.804 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:09.804 [134/378] Linking static target lib/librte_eal.a 00:02:09.804 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:09.804 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:09.804 [137/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.804 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:10.067 [139/378] Linking target lib/librte_log.so.24.1 00:02:10.067 [140/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:10.067 [141/378] Linking static target lib/librte_mbuf.a 00:02:10.067 [142/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:10.067 [143/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:10.067 [144/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.067 [145/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:10.067 [146/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:10.067 [147/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:10.067 [148/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:10.067 [149/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:10.067 [150/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:10.067 [151/378] Linking static target lib/librte_timer.a 00:02:10.067 [152/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.067 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:10.067 [154/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:10.067 [155/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.067 [156/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:10.067 [157/378] Compiling C 
object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:10.067 [158/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:10.328 [159/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:10.328 [160/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:10.328 [161/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:10.328 [162/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:10.328 [163/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:10.328 [164/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:10.328 [165/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:10.328 [166/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:10.328 [167/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:10.328 [168/378] Linking target lib/librte_kvargs.so.24.1 00:02:10.328 [169/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:10.328 [170/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:10.328 [171/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:10.328 [172/378] Linking static target lib/librte_dmadev.a 00:02:10.328 [173/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:10.328 [174/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:10.328 [175/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.328 [176/378] Linking static target lib/librte_compressdev.a 00:02:10.328 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:10.328 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:10.328 [179/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:10.328 [180/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:10.328 [181/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:10.328 [182/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:10.328 [183/378] Linking static target lib/librte_power.a 00:02:10.328 [184/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:10.328 [185/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:10.328 [186/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:10.328 [187/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:10.328 [188/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.328 [189/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:10.328 [190/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:10.328 [191/378] Linking static target lib/librte_reorder.a 00:02:10.328 [192/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:10.328 [193/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:10.328 [194/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:10.328 [195/378] Generating symbol file 
lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:10.328 [196/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:10.328 [197/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:10.328 [198/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:10.328 [199/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:10.328 [200/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:10.328 [201/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:10.328 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:10.328 [203/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:10.587 [204/378] Linking target lib/librte_telemetry.so.24.1 00:02:10.587 [205/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:10.587 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:10.587 [207/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:10.587 [208/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:10.587 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:10.587 [210/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:10.587 [211/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:10.587 [212/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:10.587 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:10.587 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:10.587 [215/378] Linking static target lib/librte_security.a 00:02:10.587 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:10.587 [217/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:10.587 [218/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:10.587 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:10.587 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:10.587 [221/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:10.587 [222/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:10.587 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:10.587 [224/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:10.587 [225/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:10.587 [226/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:10.587 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:10.587 [228/378] Linking static target drivers/librte_bus_vdev.a 00:02:10.587 [229/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:10.587 [230/378] Compiling C 
object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:10.587 [231/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.587 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:10.587 [233/378] Linking static target drivers/librte_bus_pci.a 00:02:10.587 [234/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:10.587 [235/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:10.587 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:10.587 [237/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:10.587 [238/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:10.587 [239/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:10.587 [240/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:10.587 [241/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:10.587 [242/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:10.587 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:10.587 [244/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:10.587 [245/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.587 [246/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:10.587 [247/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:10.587 [248/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:10.587 [249/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:10.587 [250/378] Linking static target lib/librte_hash.a 00:02:10.587 [251/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.587 [252/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:10.845 [253/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:10.845 [254/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:10.845 [255/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:10.845 [256/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.845 [257/378] Linking static target lib/librte_cryptodev.a 00:02:10.845 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:10.845 [259/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:10.845 [260/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:10.845 [261/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:10.845 [262/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:10.845 [263/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:10.845 [264/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture 
output) 00:02:10.845 [265/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:10.845 [266/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:10.845 [267/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:10.845 [268/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:10.845 [269/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:10.845 [270/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.845 [271/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:10.845 [272/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:10.845 [273/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.845 [274/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.845 [275/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:10.845 [276/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.845 [277/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:10.845 [278/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:10.845 [279/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:10.845 [280/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:11.103 [281/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.103 [282/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:11.103 [283/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:11.104 [284/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:11.104 [285/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:11.104 [286/378] Linking static target drivers/librte_mempool_ring.a 00:02:11.104 [287/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:11.104 [288/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:11.104 [289/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:11.104 [290/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:11.104 [291/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:11.104 [292/378] Linking static target lib/librte_ethdev.a 00:02:11.104 [293/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:11.104 [294/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.104 [295/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:11.104 [296/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:11.104 [297/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:11.104 [298/378] Compiling C object 
drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:11.104 [299/378] Linking static target drivers/librte_compress_mlx5.a 00:02:11.104 [300/378] Linking static target drivers/librte_compress_isal.a 00:02:11.104 [301/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:11.104 [302/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:11.104 [303/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:11.104 [304/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:11.104 [305/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:11.104 [306/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:11.104 [307/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.361 [308/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:11.361 [309/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:11.361 [310/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:11.361 [311/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:11.361 [312/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:11.361 [313/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:11.361 [314/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:11.361 [315/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:11.361 [316/378] Linking static target drivers/librte_common_mlx5.a 00:02:11.361 [317/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.619 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:11.619 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:11.876 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:11.876 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:11.876 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:11.876 [323/378] Linking static target drivers/librte_common_qat.a 00:02:12.441 [324/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.441 [325/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:12.441 [326/378] Linking static target lib/librte_vhost.a 00:02:14.336 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.231 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.797 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.363 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.363 [331/378] Linking target lib/librte_eal.so.24.1 00:02:19.621 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:19.621 [333/378] Linking target lib/librte_ring.so.24.1 00:02:19.621 [334/378] Linking target lib/librte_dmadev.so.24.1 00:02:19.621 [335/378] 
Linking target drivers/librte_bus_vdev.so.24.1 00:02:19.621 [336/378] Linking target lib/librte_pci.so.24.1 00:02:19.621 [337/378] Linking target lib/librte_timer.so.24.1 00:02:19.621 [338/378] Linking target lib/librte_meter.so.24.1 00:02:19.621 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:19.621 [340/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:19.621 [341/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:19.621 [342/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:19.621 [343/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:19.621 [344/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:19.621 [345/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:19.621 [346/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:19.621 [347/378] Linking target lib/librte_rcu.so.24.1 00:02:19.621 [348/378] Linking target lib/librte_mempool.so.24.1 00:02:19.621 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:19.880 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:19.880 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:19.880 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:19.880 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:19.880 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:19.880 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:20.137 [356/378] Linking target lib/librte_net.so.24.1 00:02:20.137 [357/378] Linking target lib/librte_reorder.so.24.1 00:02:20.137 [358/378] Linking target lib/librte_compressdev.so.24.1 00:02:20.137 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:20.137 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:20.138 [361/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:20.138 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:20.138 [363/378] Linking target lib/librte_security.so.24.1 00:02:20.138 [364/378] Linking target lib/librte_hash.so.24.1 00:02:20.138 [365/378] Linking target lib/librte_cmdline.so.24.1 00:02:20.138 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:20.138 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:20.395 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:20.395 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:20.395 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:20.395 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:20.395 [372/378] Linking target lib/librte_power.so.24.1 00:02:20.395 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:20.653 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:20.653 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:20.653 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:02:20.653 [377/378] Linking target 
drivers/librte_compress_mlx5.so.24.1 00:02:20.653 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:20.653 INFO: autodetecting backend as ninja 00:02:20.653 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 96 00:02:21.586 CC lib/ut/ut.o 00:02:21.586 CC lib/log/log.o 00:02:21.586 CC lib/log/log_flags.o 00:02:21.586 CC lib/log/log_deprecated.o 00:02:21.586 CC lib/ut_mock/mock.o 00:02:21.586 LIB libspdk_ut.a 00:02:21.843 LIB libspdk_log.a 00:02:21.843 LIB libspdk_ut_mock.a 00:02:21.843 SO libspdk_ut.so.2.0 00:02:21.843 SO libspdk_log.so.7.0 00:02:21.843 SO libspdk_ut_mock.so.6.0 00:02:21.843 SYMLINK libspdk_ut.so 00:02:21.843 SYMLINK libspdk_log.so 00:02:21.843 SYMLINK libspdk_ut_mock.so 00:02:22.101 CC lib/dma/dma.o 00:02:22.101 CXX lib/trace_parser/trace.o 00:02:22.101 CC lib/ioat/ioat.o 00:02:22.101 CC lib/util/base64.o 00:02:22.101 CC lib/util/bit_array.o 00:02:22.101 CC lib/util/crc32.o 00:02:22.101 CC lib/util/cpuset.o 00:02:22.101 CC lib/util/crc16.o 00:02:22.101 CC lib/util/crc32c.o 00:02:22.101 CC lib/util/crc32_ieee.o 00:02:22.101 CC lib/util/crc64.o 00:02:22.101 CC lib/util/dif.o 00:02:22.101 CC lib/util/fd.o 00:02:22.101 CC lib/util/fd_group.o 00:02:22.101 CC lib/util/file.o 00:02:22.101 CC lib/util/hexlify.o 00:02:22.101 CC lib/util/iov.o 00:02:22.101 CC lib/util/math.o 00:02:22.101 CC lib/util/net.o 00:02:22.101 CC lib/util/pipe.o 00:02:22.101 CC lib/util/strerror_tls.o 00:02:22.101 CC lib/util/uuid.o 00:02:22.101 CC lib/util/string.o 00:02:22.101 CC lib/util/zipf.o 00:02:22.101 CC lib/util/xor.o 00:02:22.360 LIB libspdk_dma.a 00:02:22.360 CC lib/vfio_user/host/vfio_user_pci.o 00:02:22.360 CC lib/vfio_user/host/vfio_user.o 00:02:22.360 SO libspdk_dma.so.4.0 00:02:22.360 SYMLINK libspdk_dma.so 00:02:22.360 LIB libspdk_ioat.a 00:02:22.360 SO libspdk_ioat.so.7.0 00:02:22.360 SYMLINK libspdk_ioat.so 00:02:22.360 LIB libspdk_vfio_user.a 00:02:22.617 SO libspdk_vfio_user.so.5.0 00:02:22.618 LIB libspdk_util.a 00:02:22.618 SYMLINK libspdk_vfio_user.so 00:02:22.618 SO libspdk_util.so.10.0 00:02:22.875 SYMLINK libspdk_util.so 00:02:22.875 LIB libspdk_trace_parser.a 00:02:22.875 SO libspdk_trace_parser.so.5.0 00:02:22.875 SYMLINK libspdk_trace_parser.so 00:02:23.134 CC lib/rdma_provider/common.o 00:02:23.134 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:23.134 CC lib/rdma_utils/rdma_utils.o 00:02:23.134 CC lib/json/json_parse.o 00:02:23.134 CC lib/json/json_util.o 00:02:23.134 CC lib/json/json_write.o 00:02:23.134 CC lib/reduce/reduce.o 00:02:23.134 CC lib/idxd/idxd.o 00:02:23.134 CC lib/idxd/idxd_user.o 00:02:23.134 CC lib/idxd/idxd_kernel.o 00:02:23.134 CC lib/conf/conf.o 00:02:23.134 CC lib/env_dpdk/env.o 00:02:23.134 CC lib/env_dpdk/memory.o 00:02:23.134 CC lib/env_dpdk/pci.o 00:02:23.134 CC lib/env_dpdk/init.o 00:02:23.134 CC lib/env_dpdk/pci_ioat.o 00:02:23.134 CC lib/env_dpdk/threads.o 00:02:23.134 CC lib/env_dpdk/pci_virtio.o 00:02:23.134 CC lib/env_dpdk/pci_vmd.o 00:02:23.134 CC lib/env_dpdk/pci_idxd.o 00:02:23.134 CC lib/vmd/vmd.o 00:02:23.134 CC lib/env_dpdk/pci_event.o 00:02:23.134 CC lib/vmd/led.o 00:02:23.134 CC lib/env_dpdk/sigbus_handler.o 00:02:23.134 CC lib/env_dpdk/pci_dpdk.o 00:02:23.134 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:23.134 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:23.134 LIB libspdk_rdma_provider.a 00:02:23.134 SO libspdk_rdma_provider.so.6.0 00:02:23.392 LIB libspdk_conf.a 00:02:23.392 LIB libspdk_rdma_utils.a 00:02:23.392 SYMLINK libspdk_rdma_provider.so 
00:02:23.392 SO libspdk_conf.so.6.0 00:02:23.392 LIB libspdk_json.a 00:02:23.392 SO libspdk_rdma_utils.so.1.0 00:02:23.392 SYMLINK libspdk_conf.so 00:02:23.392 SO libspdk_json.so.6.0 00:02:23.392 SYMLINK libspdk_rdma_utils.so 00:02:23.392 SYMLINK libspdk_json.so 00:02:23.392 LIB libspdk_idxd.a 00:02:23.650 SO libspdk_idxd.so.12.0 00:02:23.650 LIB libspdk_vmd.a 00:02:23.650 LIB libspdk_reduce.a 00:02:23.650 SO libspdk_vmd.so.6.0 00:02:23.650 SYMLINK libspdk_idxd.so 00:02:23.650 SO libspdk_reduce.so.6.1 00:02:23.650 SYMLINK libspdk_vmd.so 00:02:23.650 SYMLINK libspdk_reduce.so 00:02:23.650 CC lib/jsonrpc/jsonrpc_server.o 00:02:23.650 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:23.650 CC lib/jsonrpc/jsonrpc_client.o 00:02:23.650 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:23.909 LIB libspdk_jsonrpc.a 00:02:23.909 SO libspdk_jsonrpc.so.6.0 00:02:23.909 SYMLINK libspdk_jsonrpc.so 00:02:24.166 LIB libspdk_env_dpdk.a 00:02:24.166 SO libspdk_env_dpdk.so.15.0 00:02:24.166 SYMLINK libspdk_env_dpdk.so 00:02:24.166 CC lib/rpc/rpc.o 00:02:24.424 LIB libspdk_rpc.a 00:02:24.424 SO libspdk_rpc.so.6.0 00:02:24.681 SYMLINK libspdk_rpc.so 00:02:24.937 CC lib/trace/trace.o 00:02:24.937 CC lib/trace/trace_flags.o 00:02:24.937 CC lib/trace/trace_rpc.o 00:02:24.937 CC lib/keyring/keyring_rpc.o 00:02:24.937 CC lib/keyring/keyring.o 00:02:24.937 CC lib/notify/notify.o 00:02:24.937 CC lib/notify/notify_rpc.o 00:02:24.937 LIB libspdk_notify.a 00:02:24.937 LIB libspdk_keyring.a 00:02:24.937 LIB libspdk_trace.a 00:02:24.937 SO libspdk_notify.so.6.0 00:02:24.937 SO libspdk_keyring.so.1.0 00:02:25.195 SO libspdk_trace.so.10.0 00:02:25.195 SYMLINK libspdk_notify.so 00:02:25.195 SYMLINK libspdk_keyring.so 00:02:25.195 SYMLINK libspdk_trace.so 00:02:25.451 CC lib/thread/thread.o 00:02:25.451 CC lib/thread/iobuf.o 00:02:25.451 CC lib/sock/sock_rpc.o 00:02:25.451 CC lib/sock/sock.o 00:02:25.708 LIB libspdk_sock.a 00:02:25.708 SO libspdk_sock.so.10.0 00:02:25.708 SYMLINK libspdk_sock.so 00:02:25.965 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:25.965 CC lib/nvme/nvme_ctrlr.o 00:02:25.965 CC lib/nvme/nvme_fabric.o 00:02:25.965 CC lib/nvme/nvme_ns_cmd.o 00:02:25.965 CC lib/nvme/nvme_ns.o 00:02:25.965 CC lib/nvme/nvme_pcie_common.o 00:02:25.965 CC lib/nvme/nvme_pcie.o 00:02:25.965 CC lib/nvme/nvme_qpair.o 00:02:25.965 CC lib/nvme/nvme.o 00:02:25.965 CC lib/nvme/nvme_quirks.o 00:02:25.965 CC lib/nvme/nvme_transport.o 00:02:25.965 CC lib/nvme/nvme_discovery.o 00:02:25.966 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:25.966 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:25.966 CC lib/nvme/nvme_tcp.o 00:02:25.966 CC lib/nvme/nvme_opal.o 00:02:25.966 CC lib/nvme/nvme_io_msg.o 00:02:25.966 CC lib/nvme/nvme_poll_group.o 00:02:25.966 CC lib/nvme/nvme_stubs.o 00:02:25.966 CC lib/nvme/nvme_zns.o 00:02:25.966 CC lib/nvme/nvme_auth.o 00:02:25.966 CC lib/nvme/nvme_cuse.o 00:02:25.966 CC lib/nvme/nvme_rdma.o 00:02:26.530 LIB libspdk_thread.a 00:02:26.530 SO libspdk_thread.so.10.1 00:02:26.530 SYMLINK libspdk_thread.so 00:02:26.788 CC lib/accel/accel_rpc.o 00:02:26.788 CC lib/accel/accel.o 00:02:26.788 CC lib/accel/accel_sw.o 00:02:26.788 CC lib/blob/blobstore.o 00:02:26.788 CC lib/blob/request.o 00:02:26.788 CC lib/blob/zeroes.o 00:02:26.788 CC lib/blob/blob_bs_dev.o 00:02:26.788 CC lib/init/json_config.o 00:02:26.788 CC lib/virtio/virtio_vfio_user.o 00:02:26.788 CC lib/virtio/virtio.o 00:02:26.788 CC lib/init/subsystem.o 00:02:26.788 CC lib/virtio/virtio_vhost_user.o 00:02:26.788 CC lib/init/subsystem_rpc.o 00:02:26.788 CC lib/init/rpc.o 00:02:26.788 CC 
lib/virtio/virtio_pci.o 00:02:27.046 LIB libspdk_init.a 00:02:27.046 SO libspdk_init.so.5.0 00:02:27.046 LIB libspdk_virtio.a 00:02:27.046 SYMLINK libspdk_init.so 00:02:27.046 SO libspdk_virtio.so.7.0 00:02:27.304 SYMLINK libspdk_virtio.so 00:02:27.304 CC lib/event/app.o 00:02:27.304 CC lib/event/reactor.o 00:02:27.304 CC lib/event/log_rpc.o 00:02:27.304 CC lib/event/app_rpc.o 00:02:27.304 CC lib/event/scheduler_static.o 00:02:27.562 LIB libspdk_accel.a 00:02:27.562 SO libspdk_accel.so.16.0 00:02:27.562 SYMLINK libspdk_accel.so 00:02:27.562 LIB libspdk_nvme.a 00:02:27.820 LIB libspdk_event.a 00:02:27.820 SO libspdk_nvme.so.13.1 00:02:27.820 SO libspdk_event.so.14.0 00:02:27.820 SYMLINK libspdk_event.so 00:02:27.820 CC lib/bdev/bdev.o 00:02:27.820 CC lib/bdev/bdev_zone.o 00:02:27.820 CC lib/bdev/bdev_rpc.o 00:02:27.820 CC lib/bdev/scsi_nvme.o 00:02:27.820 CC lib/bdev/part.o 00:02:28.078 SYMLINK libspdk_nvme.so 00:02:29.013 LIB libspdk_blob.a 00:02:29.013 SO libspdk_blob.so.11.0 00:02:29.013 SYMLINK libspdk_blob.so 00:02:29.271 CC lib/lvol/lvol.o 00:02:29.271 CC lib/blobfs/blobfs.o 00:02:29.271 CC lib/blobfs/tree.o 00:02:29.530 LIB libspdk_bdev.a 00:02:29.788 SO libspdk_bdev.so.16.0 00:02:29.788 SYMLINK libspdk_bdev.so 00:02:29.788 LIB libspdk_blobfs.a 00:02:29.788 SO libspdk_blobfs.so.10.0 00:02:29.788 LIB libspdk_lvol.a 00:02:30.047 SO libspdk_lvol.so.10.0 00:02:30.047 SYMLINK libspdk_blobfs.so 00:02:30.047 SYMLINK libspdk_lvol.so 00:02:30.047 CC lib/ftl/ftl_init.o 00:02:30.047 CC lib/ftl/ftl_layout.o 00:02:30.047 CC lib/ftl/ftl_core.o 00:02:30.047 CC lib/ftl/ftl_debug.o 00:02:30.047 CC lib/nvmf/ctrlr_discovery.o 00:02:30.047 CC lib/nvmf/ctrlr.o 00:02:30.047 CC lib/nvmf/subsystem.o 00:02:30.047 CC lib/nvmf/ctrlr_bdev.o 00:02:30.047 CC lib/ftl/ftl_io.o 00:02:30.047 CC lib/ftl/ftl_sb.o 00:02:30.047 CC lib/nvmf/nvmf_rpc.o 00:02:30.047 CC lib/nvmf/nvmf.o 00:02:30.047 CC lib/ftl/ftl_l2p.o 00:02:30.047 CC lib/ftl/ftl_l2p_flat.o 00:02:30.047 CC lib/nvmf/transport.o 00:02:30.047 CC lib/ftl/ftl_nv_cache.o 00:02:30.047 CC lib/ftl/ftl_band.o 00:02:30.047 CC lib/nvmf/tcp.o 00:02:30.047 CC lib/nvmf/stubs.o 00:02:30.047 CC lib/ftl/ftl_band_ops.o 00:02:30.047 CC lib/nvmf/mdns_server.o 00:02:30.047 CC lib/nvmf/rdma.o 00:02:30.047 CC lib/ftl/ftl_writer.o 00:02:30.047 CC lib/ftl/ftl_reloc.o 00:02:30.047 CC lib/ftl/ftl_rq.o 00:02:30.047 CC lib/nvmf/auth.o 00:02:30.047 CC lib/nbd/nbd.o 00:02:30.047 CC lib/ftl/ftl_l2p_cache.o 00:02:30.047 CC lib/ftl/ftl_p2l.o 00:02:30.047 CC lib/nbd/nbd_rpc.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:30.047 CC lib/ublk/ublk.o 00:02:30.047 CC lib/ublk/ublk_rpc.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:30.047 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:30.047 CC lib/scsi/dev.o 00:02:30.047 CC lib/ftl/utils/ftl_conf.o 00:02:30.047 CC lib/scsi/lun.o 00:02:30.047 CC lib/scsi/port.o 00:02:30.047 CC lib/scsi/scsi_bdev.o 00:02:30.047 CC lib/ftl/utils/ftl_mempool.o 00:02:30.047 CC lib/scsi/scsi.o 00:02:30.047 CC lib/ftl/utils/ftl_bitmap.o 00:02:30.047 CC lib/ftl/utils/ftl_md.o 00:02:30.047 CC lib/scsi/scsi_pr.o 
00:02:30.047 CC lib/scsi/scsi_rpc.o 00:02:30.047 CC lib/ftl/utils/ftl_property.o 00:02:30.047 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:30.047 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:30.047 CC lib/scsi/task.o 00:02:30.047 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:30.047 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:30.047 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:30.047 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:30.047 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:30.047 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:30.047 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:30.047 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:30.047 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:30.047 CC lib/ftl/base/ftl_base_dev.o 00:02:30.047 CC lib/ftl/base/ftl_base_bdev.o 00:02:30.047 CC lib/ftl/ftl_trace.o 00:02:30.614 LIB libspdk_nbd.a 00:02:30.614 SO libspdk_nbd.so.7.0 00:02:30.614 LIB libspdk_scsi.a 00:02:30.614 SO libspdk_scsi.so.9.0 00:02:30.872 SYMLINK libspdk_nbd.so 00:02:30.872 SYMLINK libspdk_scsi.so 00:02:30.872 LIB libspdk_ublk.a 00:02:30.872 SO libspdk_ublk.so.3.0 00:02:30.872 SYMLINK libspdk_ublk.so 00:02:31.130 LIB libspdk_ftl.a 00:02:31.130 CC lib/vhost/vhost.o 00:02:31.130 CC lib/vhost/vhost_rpc.o 00:02:31.130 CC lib/vhost/vhost_scsi.o 00:02:31.130 CC lib/vhost/rte_vhost_user.o 00:02:31.130 CC lib/vhost/vhost_blk.o 00:02:31.130 CC lib/iscsi/conn.o 00:02:31.130 CC lib/iscsi/init_grp.o 00:02:31.130 CC lib/iscsi/iscsi.o 00:02:31.130 CC lib/iscsi/md5.o 00:02:31.130 CC lib/iscsi/portal_grp.o 00:02:31.130 CC lib/iscsi/param.o 00:02:31.130 CC lib/iscsi/tgt_node.o 00:02:31.130 CC lib/iscsi/task.o 00:02:31.130 CC lib/iscsi/iscsi_subsystem.o 00:02:31.130 CC lib/iscsi/iscsi_rpc.o 00:02:31.130 SO libspdk_ftl.so.9.0 00:02:31.387 SYMLINK libspdk_ftl.so 00:02:31.645 LIB libspdk_nvmf.a 00:02:31.902 SO libspdk_nvmf.so.19.0 00:02:31.902 LIB libspdk_vhost.a 00:02:31.902 SO libspdk_vhost.so.8.0 00:02:31.902 SYMLINK libspdk_nvmf.so 00:02:31.902 SYMLINK libspdk_vhost.so 00:02:32.160 LIB libspdk_iscsi.a 00:02:32.160 SO libspdk_iscsi.so.8.0 00:02:32.160 SYMLINK libspdk_iscsi.so 00:02:32.728 CC module/env_dpdk/env_dpdk_rpc.o 00:02:32.728 CC module/sock/posix/posix.o 00:02:32.728 CC module/scheduler/gscheduler/gscheduler.o 00:02:32.728 CC module/keyring/file/keyring_rpc.o 00:02:32.728 CC module/keyring/file/keyring.o 00:02:32.728 CC module/accel/dsa/accel_dsa_rpc.o 00:02:32.728 CC module/accel/dsa/accel_dsa.o 00:02:32.728 LIB libspdk_env_dpdk_rpc.a 00:02:32.986 CC module/blob/bdev/blob_bdev.o 00:02:32.986 CC module/keyring/linux/keyring.o 00:02:32.986 CC module/keyring/linux/keyring_rpc.o 00:02:32.986 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:32.986 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:32.986 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:32.986 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:32.986 CC module/accel/error/accel_error.o 00:02:32.986 CC module/accel/error/accel_error_rpc.o 00:02:32.986 CC module/accel/ioat/accel_ioat.o 00:02:32.986 CC module/accel/ioat/accel_ioat_rpc.o 00:02:32.986 CC module/accel/iaa/accel_iaa.o 00:02:32.986 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:32.986 CC module/accel/iaa/accel_iaa_rpc.o 00:02:32.986 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:32.986 SO libspdk_env_dpdk_rpc.so.6.0 00:02:32.986 SYMLINK libspdk_env_dpdk_rpc.so 00:02:32.986 LIB libspdk_scheduler_gscheduler.a 00:02:32.986 LIB libspdk_keyring_file.a 00:02:32.986 LIB libspdk_scheduler_dpdk_governor.a 00:02:32.986 LIB 
libspdk_keyring_linux.a 00:02:32.986 SO libspdk_scheduler_gscheduler.so.4.0 00:02:32.986 SO libspdk_keyring_file.so.1.0 00:02:32.986 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:32.986 SO libspdk_keyring_linux.so.1.0 00:02:32.986 LIB libspdk_scheduler_dynamic.a 00:02:32.986 LIB libspdk_accel_error.a 00:02:32.986 SYMLINK libspdk_scheduler_gscheduler.so 00:02:32.986 LIB libspdk_accel_ioat.a 00:02:32.986 SO libspdk_scheduler_dynamic.so.4.0 00:02:32.986 LIB libspdk_accel_iaa.a 00:02:32.986 SYMLINK libspdk_keyring_file.so 00:02:32.986 LIB libspdk_accel_dsa.a 00:02:32.986 SO libspdk_accel_error.so.2.0 00:02:32.986 SO libspdk_accel_ioat.so.6.0 00:02:32.986 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:32.986 LIB libspdk_blob_bdev.a 00:02:32.986 SYMLINK libspdk_keyring_linux.so 00:02:33.245 SO libspdk_accel_dsa.so.5.0 00:02:33.245 SO libspdk_accel_iaa.so.3.0 00:02:33.245 SYMLINK libspdk_scheduler_dynamic.so 00:02:33.245 SO libspdk_blob_bdev.so.11.0 00:02:33.245 SYMLINK libspdk_accel_ioat.so 00:02:33.245 SYMLINK libspdk_accel_error.so 00:02:33.245 SYMLINK libspdk_accel_dsa.so 00:02:33.245 SYMLINK libspdk_accel_iaa.so 00:02:33.245 SYMLINK libspdk_blob_bdev.so 00:02:33.513 LIB libspdk_sock_posix.a 00:02:33.513 SO libspdk_sock_posix.so.6.0 00:02:33.513 SYMLINK libspdk_sock_posix.so 00:02:33.513 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:33.513 CC module/bdev/delay/vbdev_delay.o 00:02:33.812 CC module/blobfs/bdev/blobfs_bdev.o 00:02:33.812 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:33.812 CC module/bdev/null/bdev_null.o 00:02:33.812 CC module/bdev/error/vbdev_error.o 00:02:33.812 CC module/bdev/null/bdev_null_rpc.o 00:02:33.812 CC module/bdev/error/vbdev_error_rpc.o 00:02:33.812 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:33.812 CC module/bdev/passthru/vbdev_passthru.o 00:02:33.812 CC module/bdev/gpt/gpt.o 00:02:33.812 CC module/bdev/gpt/vbdev_gpt.o 00:02:33.812 CC module/bdev/aio/bdev_aio.o 00:02:33.812 CC module/bdev/aio/bdev_aio_rpc.o 00:02:33.812 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:33.812 CC module/bdev/compress/vbdev_compress.o 00:02:33.812 CC module/bdev/lvol/vbdev_lvol.o 00:02:33.812 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:33.812 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:33.812 CC module/bdev/raid/bdev_raid.o 00:02:33.812 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:33.812 CC module/bdev/ftl/bdev_ftl.o 00:02:33.812 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:33.812 CC module/bdev/raid/bdev_raid_rpc.o 00:02:33.812 CC module/bdev/raid/bdev_raid_sb.o 00:02:33.812 CC module/bdev/raid/raid0.o 00:02:33.812 CC module/bdev/split/vbdev_split.o 00:02:33.812 CC module/bdev/raid/raid1.o 00:02:33.812 CC module/bdev/split/vbdev_split_rpc.o 00:02:33.812 CC module/bdev/raid/concat.o 00:02:33.812 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:33.812 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:33.812 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:33.812 CC module/bdev/nvme/bdev_nvme.o 00:02:33.812 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:33.812 CC module/bdev/nvme/nvme_rpc.o 00:02:33.812 CC module/bdev/iscsi/bdev_iscsi.o 00:02:33.812 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:33.812 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:33.812 CC module/bdev/nvme/bdev_mdns_client.o 00:02:33.812 CC module/bdev/nvme/vbdev_opal.o 00:02:33.812 CC module/bdev/malloc/bdev_malloc.o 00:02:33.812 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:33.812 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:33.812 CC module/bdev/crypto/vbdev_crypto.o 00:02:33.812 CC 
module/bdev/crypto/vbdev_crypto_rpc.o 00:02:33.812 LIB libspdk_accel_dpdk_compressdev.a 00:02:33.812 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:33.812 LIB libspdk_blobfs_bdev.a 00:02:33.812 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:33.812 SO libspdk_blobfs_bdev.so.6.0 00:02:33.812 LIB libspdk_bdev_error.a 00:02:33.812 LIB libspdk_accel_dpdk_cryptodev.a 00:02:33.812 LIB libspdk_bdev_split.a 00:02:34.071 LIB libspdk_bdev_null.a 00:02:34.071 LIB libspdk_bdev_gpt.a 00:02:34.071 SO libspdk_bdev_error.so.6.0 00:02:34.071 LIB libspdk_bdev_passthru.a 00:02:34.071 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:34.071 SYMLINK libspdk_blobfs_bdev.so 00:02:34.071 SO libspdk_bdev_split.so.6.0 00:02:34.071 LIB libspdk_bdev_ftl.a 00:02:34.071 SO libspdk_bdev_gpt.so.6.0 00:02:34.071 SO libspdk_bdev_null.so.6.0 00:02:34.071 SO libspdk_bdev_ftl.so.6.0 00:02:34.071 SO libspdk_bdev_passthru.so.6.0 00:02:34.071 LIB libspdk_bdev_aio.a 00:02:34.071 SYMLINK libspdk_bdev_error.so 00:02:34.071 SYMLINK libspdk_bdev_split.so 00:02:34.071 SYMLINK libspdk_bdev_null.so 00:02:34.071 LIB libspdk_bdev_compress.a 00:02:34.071 SYMLINK libspdk_bdev_gpt.so 00:02:34.071 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:34.071 LIB libspdk_bdev_delay.a 00:02:34.071 SO libspdk_bdev_aio.so.6.0 00:02:34.071 LIB libspdk_bdev_zone_block.a 00:02:34.071 LIB libspdk_bdev_crypto.a 00:02:34.071 SYMLINK libspdk_bdev_passthru.so 00:02:34.071 SYMLINK libspdk_bdev_ftl.so 00:02:34.071 SO libspdk_bdev_compress.so.6.0 00:02:34.071 LIB libspdk_bdev_malloc.a 00:02:34.071 LIB libspdk_bdev_iscsi.a 00:02:34.071 SO libspdk_bdev_delay.so.6.0 00:02:34.071 SO libspdk_bdev_zone_block.so.6.0 00:02:34.071 SO libspdk_bdev_crypto.so.6.0 00:02:34.071 SO libspdk_bdev_malloc.so.6.0 00:02:34.071 SYMLINK libspdk_bdev_aio.so 00:02:34.071 SO libspdk_bdev_iscsi.so.6.0 00:02:34.071 SYMLINK libspdk_bdev_compress.so 00:02:34.071 SYMLINK libspdk_bdev_delay.so 00:02:34.071 SYMLINK libspdk_bdev_zone_block.so 00:02:34.071 SYMLINK libspdk_bdev_crypto.so 00:02:34.071 SYMLINK libspdk_bdev_iscsi.so 00:02:34.071 SYMLINK libspdk_bdev_malloc.so 00:02:34.071 LIB libspdk_bdev_lvol.a 00:02:34.071 LIB libspdk_bdev_virtio.a 00:02:34.330 SO libspdk_bdev_lvol.so.6.0 00:02:34.330 SO libspdk_bdev_virtio.so.6.0 00:02:34.330 SYMLINK libspdk_bdev_lvol.so 00:02:34.330 SYMLINK libspdk_bdev_virtio.so 00:02:34.588 LIB libspdk_bdev_raid.a 00:02:34.588 SO libspdk_bdev_raid.so.6.0 00:02:34.588 SYMLINK libspdk_bdev_raid.so 00:02:35.156 LIB libspdk_bdev_nvme.a 00:02:35.415 SO libspdk_bdev_nvme.so.7.0 00:02:35.415 SYMLINK libspdk_bdev_nvme.so 00:02:35.983 CC module/event/subsystems/scheduler/scheduler.o 00:02:35.983 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:35.983 CC module/event/subsystems/keyring/keyring.o 00:02:35.983 CC module/event/subsystems/sock/sock.o 00:02:35.983 CC module/event/subsystems/iobuf/iobuf.o 00:02:35.983 CC module/event/subsystems/vmd/vmd.o 00:02:35.983 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:35.983 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:36.242 LIB libspdk_event_scheduler.a 00:02:36.242 LIB libspdk_event_vhost_blk.a 00:02:36.242 LIB libspdk_event_sock.a 00:02:36.243 SO libspdk_event_scheduler.so.4.0 00:02:36.243 LIB libspdk_event_keyring.a 00:02:36.243 LIB libspdk_event_iobuf.a 00:02:36.243 LIB libspdk_event_vmd.a 00:02:36.243 SO libspdk_event_vhost_blk.so.3.0 00:02:36.243 SO libspdk_event_sock.so.5.0 00:02:36.243 SO libspdk_event_keyring.so.1.0 00:02:36.243 SO libspdk_event_iobuf.so.3.0 00:02:36.243 SO libspdk_event_vmd.so.6.0 00:02:36.243 
SYMLINK libspdk_event_scheduler.so 00:02:36.243 SYMLINK libspdk_event_keyring.so 00:02:36.243 SYMLINK libspdk_event_sock.so 00:02:36.243 SYMLINK libspdk_event_vhost_blk.so 00:02:36.243 SYMLINK libspdk_event_iobuf.so 00:02:36.243 SYMLINK libspdk_event_vmd.so 00:02:36.501 CC module/event/subsystems/accel/accel.o 00:02:36.761 LIB libspdk_event_accel.a 00:02:36.761 SO libspdk_event_accel.so.6.0 00:02:36.761 SYMLINK libspdk_event_accel.so 00:02:37.020 CC module/event/subsystems/bdev/bdev.o 00:02:37.279 LIB libspdk_event_bdev.a 00:02:37.279 SO libspdk_event_bdev.so.6.0 00:02:37.279 SYMLINK libspdk_event_bdev.so 00:02:37.538 CC module/event/subsystems/ublk/ublk.o 00:02:37.538 CC module/event/subsystems/scsi/scsi.o 00:02:37.538 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:37.538 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:37.538 CC module/event/subsystems/nbd/nbd.o 00:02:37.797 LIB libspdk_event_ublk.a 00:02:37.797 LIB libspdk_event_scsi.a 00:02:37.797 SO libspdk_event_ublk.so.3.0 00:02:37.797 SO libspdk_event_scsi.so.6.0 00:02:37.797 LIB libspdk_event_nbd.a 00:02:37.797 SO libspdk_event_nbd.so.6.0 00:02:37.797 SYMLINK libspdk_event_scsi.so 00:02:37.797 SYMLINK libspdk_event_ublk.so 00:02:37.797 LIB libspdk_event_nvmf.a 00:02:37.797 SO libspdk_event_nvmf.so.6.0 00:02:37.797 SYMLINK libspdk_event_nbd.so 00:02:37.797 SYMLINK libspdk_event_nvmf.so 00:02:38.056 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:38.056 CC module/event/subsystems/iscsi/iscsi.o 00:02:38.315 LIB libspdk_event_vhost_scsi.a 00:02:38.315 LIB libspdk_event_iscsi.a 00:02:38.315 SO libspdk_event_vhost_scsi.so.3.0 00:02:38.315 SO libspdk_event_iscsi.so.6.0 00:02:38.315 SYMLINK libspdk_event_vhost_scsi.so 00:02:38.315 SYMLINK libspdk_event_iscsi.so 00:02:38.572 SO libspdk.so.6.0 00:02:38.572 SYMLINK libspdk.so 00:02:38.846 TEST_HEADER include/spdk/accel_module.h 00:02:38.846 TEST_HEADER include/spdk/accel.h 00:02:38.846 TEST_HEADER include/spdk/assert.h 00:02:38.846 TEST_HEADER include/spdk/barrier.h 00:02:38.846 TEST_HEADER include/spdk/bdev.h 00:02:38.846 TEST_HEADER include/spdk/base64.h 00:02:38.846 TEST_HEADER include/spdk/bdev_module.h 00:02:38.846 TEST_HEADER include/spdk/bit_array.h 00:02:38.846 TEST_HEADER include/spdk/bit_pool.h 00:02:38.846 TEST_HEADER include/spdk/bdev_zone.h 00:02:38.846 TEST_HEADER include/spdk/blob_bdev.h 00:02:38.846 TEST_HEADER include/spdk/blobfs.h 00:02:38.847 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:38.847 TEST_HEADER include/spdk/conf.h 00:02:38.847 TEST_HEADER include/spdk/cpuset.h 00:02:38.847 TEST_HEADER include/spdk/config.h 00:02:38.847 TEST_HEADER include/spdk/blob.h 00:02:38.847 TEST_HEADER include/spdk/crc16.h 00:02:38.847 TEST_HEADER include/spdk/dif.h 00:02:38.847 TEST_HEADER include/spdk/crc32.h 00:02:38.847 TEST_HEADER include/spdk/crc64.h 00:02:38.847 TEST_HEADER include/spdk/dma.h 00:02:38.847 TEST_HEADER include/spdk/endian.h 00:02:38.847 TEST_HEADER include/spdk/env_dpdk.h 00:02:38.847 TEST_HEADER include/spdk/event.h 00:02:38.847 TEST_HEADER include/spdk/env.h 00:02:38.847 TEST_HEADER include/spdk/fd.h 00:02:38.847 TEST_HEADER include/spdk/fd_group.h 00:02:38.847 TEST_HEADER include/spdk/ftl.h 00:02:38.847 TEST_HEADER include/spdk/file.h 00:02:38.847 TEST_HEADER include/spdk/hexlify.h 00:02:38.847 TEST_HEADER include/spdk/histogram_data.h 00:02:38.847 TEST_HEADER include/spdk/gpt_spec.h 00:02:38.847 TEST_HEADER include/spdk/idxd.h 00:02:38.847 TEST_HEADER include/spdk/init.h 00:02:38.847 TEST_HEADER include/spdk/idxd_spec.h 00:02:38.847 TEST_HEADER 
include/spdk/ioat.h 00:02:38.847 TEST_HEADER include/spdk/ioat_spec.h 00:02:38.847 TEST_HEADER include/spdk/iscsi_spec.h 00:02:38.847 CC test/rpc_client/rpc_client_test.o 00:02:38.847 TEST_HEADER include/spdk/json.h 00:02:38.847 TEST_HEADER include/spdk/jsonrpc.h 00:02:38.847 CC app/spdk_nvme_perf/perf.o 00:02:38.847 TEST_HEADER include/spdk/keyring_module.h 00:02:38.847 TEST_HEADER include/spdk/keyring.h 00:02:38.847 CC app/spdk_top/spdk_top.o 00:02:38.847 TEST_HEADER include/spdk/likely.h 00:02:38.847 TEST_HEADER include/spdk/log.h 00:02:38.847 CC app/spdk_lspci/spdk_lspci.o 00:02:38.847 TEST_HEADER include/spdk/memory.h 00:02:38.847 TEST_HEADER include/spdk/lvol.h 00:02:38.847 TEST_HEADER include/spdk/mmio.h 00:02:38.847 TEST_HEADER include/spdk/nbd.h 00:02:38.847 CXX app/trace/trace.o 00:02:38.847 TEST_HEADER include/spdk/net.h 00:02:38.847 TEST_HEADER include/spdk/notify.h 00:02:38.847 TEST_HEADER include/spdk/nvme.h 00:02:38.847 TEST_HEADER include/spdk/nvme_intel.h 00:02:38.847 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:38.847 CC app/spdk_nvme_discover/discovery_aer.o 00:02:38.847 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:38.847 TEST_HEADER include/spdk/nvme_spec.h 00:02:38.847 TEST_HEADER include/spdk/nvme_zns.h 00:02:38.847 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:38.847 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:38.847 CC app/trace_record/trace_record.o 00:02:38.847 TEST_HEADER include/spdk/nvmf.h 00:02:38.847 TEST_HEADER include/spdk/nvmf_spec.h 00:02:38.847 TEST_HEADER include/spdk/nvmf_transport.h 00:02:38.847 TEST_HEADER include/spdk/opal.h 00:02:38.847 TEST_HEADER include/spdk/opal_spec.h 00:02:38.847 TEST_HEADER include/spdk/pci_ids.h 00:02:38.847 CC app/spdk_nvme_identify/identify.o 00:02:38.847 TEST_HEADER include/spdk/pipe.h 00:02:38.847 TEST_HEADER include/spdk/queue.h 00:02:38.847 TEST_HEADER include/spdk/reduce.h 00:02:38.847 TEST_HEADER include/spdk/rpc.h 00:02:38.847 TEST_HEADER include/spdk/scsi.h 00:02:38.847 TEST_HEADER include/spdk/scheduler.h 00:02:38.847 TEST_HEADER include/spdk/scsi_spec.h 00:02:38.847 TEST_HEADER include/spdk/sock.h 00:02:38.847 TEST_HEADER include/spdk/stdinc.h 00:02:38.847 TEST_HEADER include/spdk/string.h 00:02:38.847 TEST_HEADER include/spdk/thread.h 00:02:38.847 TEST_HEADER include/spdk/trace.h 00:02:38.847 TEST_HEADER include/spdk/trace_parser.h 00:02:38.847 TEST_HEADER include/spdk/tree.h 00:02:38.847 TEST_HEADER include/spdk/util.h 00:02:38.847 TEST_HEADER include/spdk/uuid.h 00:02:38.847 TEST_HEADER include/spdk/ublk.h 00:02:38.847 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:38.847 TEST_HEADER include/spdk/version.h 00:02:38.847 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:38.847 TEST_HEADER include/spdk/vmd.h 00:02:38.847 TEST_HEADER include/spdk/vhost.h 00:02:38.847 TEST_HEADER include/spdk/xor.h 00:02:38.847 CXX test/cpp_headers/accel.o 00:02:38.847 TEST_HEADER include/spdk/zipf.h 00:02:38.847 CXX test/cpp_headers/accel_module.o 00:02:38.847 CXX test/cpp_headers/assert.o 00:02:38.847 CXX test/cpp_headers/barrier.o 00:02:38.847 CXX test/cpp_headers/base64.o 00:02:38.847 CXX test/cpp_headers/bdev_module.o 00:02:38.847 CXX test/cpp_headers/bdev.o 00:02:38.847 CXX test/cpp_headers/bdev_zone.o 00:02:38.847 CXX test/cpp_headers/bit_array.o 00:02:38.847 CXX test/cpp_headers/bit_pool.o 00:02:38.847 CXX test/cpp_headers/blob_bdev.o 00:02:38.847 CXX test/cpp_headers/blobfs_bdev.o 00:02:38.847 CXX test/cpp_headers/blobfs.o 00:02:38.847 CXX test/cpp_headers/blob.o 00:02:38.847 CXX test/cpp_headers/conf.o 00:02:38.847 CXX 
test/cpp_headers/config.o 00:02:38.847 CXX test/cpp_headers/crc16.o 00:02:38.847 CXX test/cpp_headers/cpuset.o 00:02:38.847 CXX test/cpp_headers/crc32.o 00:02:38.847 CXX test/cpp_headers/crc64.o 00:02:38.847 CC app/iscsi_tgt/iscsi_tgt.o 00:02:38.847 CXX test/cpp_headers/dif.o 00:02:38.847 CXX test/cpp_headers/dma.o 00:02:38.847 CXX test/cpp_headers/endian.o 00:02:38.847 CXX test/cpp_headers/env_dpdk.o 00:02:38.847 CXX test/cpp_headers/event.o 00:02:38.847 CXX test/cpp_headers/fd_group.o 00:02:38.847 CXX test/cpp_headers/fd.o 00:02:38.847 CXX test/cpp_headers/env.o 00:02:38.847 CXX test/cpp_headers/file.o 00:02:38.847 CXX test/cpp_headers/ftl.o 00:02:38.847 CXX test/cpp_headers/hexlify.o 00:02:38.847 CXX test/cpp_headers/gpt_spec.o 00:02:38.847 CXX test/cpp_headers/histogram_data.o 00:02:38.847 CXX test/cpp_headers/idxd.o 00:02:38.847 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:38.847 CXX test/cpp_headers/init.o 00:02:38.847 CXX test/cpp_headers/ioat.o 00:02:38.847 CXX test/cpp_headers/idxd_spec.o 00:02:38.847 CXX test/cpp_headers/ioat_spec.o 00:02:38.847 CXX test/cpp_headers/iscsi_spec.o 00:02:38.847 CXX test/cpp_headers/json.o 00:02:38.847 CXX test/cpp_headers/keyring.o 00:02:38.847 CXX test/cpp_headers/jsonrpc.o 00:02:38.847 CXX test/cpp_headers/likely.o 00:02:38.847 CXX test/cpp_headers/keyring_module.o 00:02:38.847 CC app/spdk_dd/spdk_dd.o 00:02:38.847 CXX test/cpp_headers/log.o 00:02:38.847 CXX test/cpp_headers/memory.o 00:02:38.847 CXX test/cpp_headers/lvol.o 00:02:38.847 CXX test/cpp_headers/mmio.o 00:02:38.847 CXX test/cpp_headers/nbd.o 00:02:38.847 CXX test/cpp_headers/net.o 00:02:38.847 CXX test/cpp_headers/notify.o 00:02:38.847 CXX test/cpp_headers/nvme.o 00:02:38.847 CXX test/cpp_headers/nvme_intel.o 00:02:38.847 CXX test/cpp_headers/nvme_ocssd.o 00:02:38.847 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:38.847 CXX test/cpp_headers/nvme_spec.o 00:02:38.847 CXX test/cpp_headers/nvmf_cmd.o 00:02:38.847 CXX test/cpp_headers/nvme_zns.o 00:02:38.847 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:38.847 CXX test/cpp_headers/nvmf.o 00:02:38.847 CXX test/cpp_headers/nvmf_spec.o 00:02:38.847 CC app/nvmf_tgt/nvmf_main.o 00:02:38.847 CXX test/cpp_headers/nvmf_transport.o 00:02:38.847 CXX test/cpp_headers/opal.o 00:02:38.847 CXX test/cpp_headers/opal_spec.o 00:02:38.847 CXX test/cpp_headers/pci_ids.o 00:02:38.847 CXX test/cpp_headers/pipe.o 00:02:38.847 CC app/spdk_tgt/spdk_tgt.o 00:02:38.847 CXX test/cpp_headers/queue.o 00:02:38.847 CXX test/cpp_headers/reduce.o 00:02:39.115 CC test/app/histogram_perf/histogram_perf.o 00:02:39.115 CC test/thread/poller_perf/poller_perf.o 00:02:39.115 CXX test/cpp_headers/rpc.o 00:02:39.115 CC test/app/jsoncat/jsoncat.o 00:02:39.115 CC test/env/vtophys/vtophys.o 00:02:39.115 CC examples/ioat/verify/verify.o 00:02:39.115 CC app/fio/nvme/fio_plugin.o 00:02:39.115 CC test/app/stub/stub.o 00:02:39.115 CC test/env/pci/pci_ut.o 00:02:39.115 CC test/env/memory/memory_ut.o 00:02:39.115 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:39.115 CC examples/util/zipf/zipf.o 00:02:39.115 CC test/dma/test_dma/test_dma.o 00:02:39.115 CC examples/ioat/perf/perf.o 00:02:39.115 CC app/fio/bdev/fio_plugin.o 00:02:39.115 CC test/app/bdev_svc/bdev_svc.o 00:02:39.377 LINK spdk_lspci 00:02:39.377 LINK rpc_client_test 00:02:39.377 CC test/env/mem_callbacks/mem_callbacks.o 00:02:39.377 LINK spdk_nvme_discover 00:02:39.377 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:39.377 LINK histogram_perf 00:02:39.377 CXX test/cpp_headers/scheduler.o 00:02:39.377 CXX 
test/cpp_headers/scsi.o 00:02:39.377 LINK jsoncat 00:02:39.377 CXX test/cpp_headers/scsi_spec.o 00:02:39.377 CXX test/cpp_headers/sock.o 00:02:39.637 CXX test/cpp_headers/stdinc.o 00:02:39.637 CXX test/cpp_headers/string.o 00:02:39.637 CXX test/cpp_headers/thread.o 00:02:39.637 CXX test/cpp_headers/trace.o 00:02:39.637 CXX test/cpp_headers/trace_parser.o 00:02:39.637 CXX test/cpp_headers/tree.o 00:02:39.637 CXX test/cpp_headers/ublk.o 00:02:39.637 CXX test/cpp_headers/util.o 00:02:39.637 CXX test/cpp_headers/version.o 00:02:39.637 CXX test/cpp_headers/uuid.o 00:02:39.637 CXX test/cpp_headers/vfio_user_pci.o 00:02:39.637 CXX test/cpp_headers/vfio_user_spec.o 00:02:39.637 LINK zipf 00:02:39.637 CXX test/cpp_headers/vhost.o 00:02:39.637 CXX test/cpp_headers/vmd.o 00:02:39.637 CXX test/cpp_headers/xor.o 00:02:39.637 CXX test/cpp_headers/zipf.o 00:02:39.637 LINK env_dpdk_post_init 00:02:39.637 LINK nvmf_tgt 00:02:39.637 LINK interrupt_tgt 00:02:39.637 LINK spdk_tgt 00:02:39.637 LINK stub 00:02:39.637 LINK vtophys 00:02:39.637 LINK poller_perf 00:02:39.637 LINK iscsi_tgt 00:02:39.637 LINK spdk_trace_record 00:02:39.637 LINK verify 00:02:39.637 LINK bdev_svc 00:02:39.637 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:39.637 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:39.637 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:39.637 LINK ioat_perf 00:02:39.637 LINK spdk_dd 00:02:39.894 LINK spdk_trace 00:02:39.894 LINK pci_ut 00:02:39.894 LINK test_dma 00:02:39.894 LINK spdk_bdev 00:02:39.894 LINK nvme_fuzz 00:02:40.152 CC examples/sock/hello_world/hello_sock.o 00:02:40.152 CC examples/vmd/led/led.o 00:02:40.152 LINK vhost_fuzz 00:02:40.152 LINK spdk_nvme_perf 00:02:40.152 CC examples/idxd/perf/perf.o 00:02:40.152 CC examples/vmd/lsvmd/lsvmd.o 00:02:40.152 CC examples/thread/thread/thread_ex.o 00:02:40.152 LINK spdk_nvme_identify 00:02:40.152 LINK spdk_nvme 00:02:40.152 LINK mem_callbacks 00:02:40.152 LINK spdk_top 00:02:40.152 CC app/vhost/vhost.o 00:02:40.152 CC test/event/reactor/reactor.o 00:02:40.152 CC test/event/reactor_perf/reactor_perf.o 00:02:40.152 CC test/event/event_perf/event_perf.o 00:02:40.152 CC test/event/app_repeat/app_repeat.o 00:02:40.152 LINK lsvmd 00:02:40.152 LINK led 00:02:40.152 CC test/event/scheduler/scheduler.o 00:02:40.152 LINK hello_sock 00:02:40.410 CC test/nvme/e2edp/nvme_dp.o 00:02:40.410 CC test/nvme/reset/reset.o 00:02:40.410 CC test/nvme/startup/startup.o 00:02:40.410 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:40.410 CC test/nvme/cuse/cuse.o 00:02:40.410 CC test/nvme/overhead/overhead.o 00:02:40.410 CC test/nvme/reserve/reserve.o 00:02:40.410 CC test/nvme/connect_stress/connect_stress.o 00:02:40.410 CC test/nvme/boot_partition/boot_partition.o 00:02:40.410 CC test/nvme/compliance/nvme_compliance.o 00:02:40.410 CC test/nvme/fused_ordering/fused_ordering.o 00:02:40.410 CC test/nvme/aer/aer.o 00:02:40.410 CC test/nvme/simple_copy/simple_copy.o 00:02:40.410 CC test/nvme/sgl/sgl.o 00:02:40.410 CC test/nvme/err_injection/err_injection.o 00:02:40.410 CC test/nvme/fdp/fdp.o 00:02:40.410 CC test/accel/dif/dif.o 00:02:40.410 LINK thread 00:02:40.410 CC test/blobfs/mkfs/mkfs.o 00:02:40.410 LINK reactor 00:02:40.410 LINK reactor_perf 00:02:40.410 LINK idxd_perf 00:02:40.410 LINK event_perf 00:02:40.410 LINK vhost 00:02:40.410 LINK app_repeat 00:02:40.410 CC test/lvol/esnap/esnap.o 00:02:40.410 LINK scheduler 00:02:40.410 LINK startup 00:02:40.410 LINK boot_partition 00:02:40.410 LINK reserve 00:02:40.410 LINK connect_stress 00:02:40.410 LINK doorbell_aers 
00:02:40.410 LINK err_injection 00:02:40.669 LINK fused_ordering 00:02:40.669 LINK reset 00:02:40.669 LINK simple_copy 00:02:40.669 LINK nvme_dp 00:02:40.669 LINK mkfs 00:02:40.669 LINK sgl 00:02:40.669 LINK memory_ut 00:02:40.669 LINK aer 00:02:40.669 LINK overhead 00:02:40.669 LINK nvme_compliance 00:02:40.669 LINK fdp 00:02:40.669 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:40.669 CC examples/nvme/abort/abort.o 00:02:40.669 CC examples/nvme/reconnect/reconnect.o 00:02:40.669 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:40.669 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:40.669 CC examples/nvme/arbitration/arbitration.o 00:02:40.669 CC examples/nvme/hotplug/hotplug.o 00:02:40.669 CC examples/nvme/hello_world/hello_world.o 00:02:40.669 LINK dif 00:02:40.927 CC examples/accel/perf/accel_perf.o 00:02:40.927 LINK pmr_persistence 00:02:40.927 CC examples/blob/hello_world/hello_blob.o 00:02:40.927 CC examples/blob/cli/blobcli.o 00:02:40.927 LINK cmb_copy 00:02:40.927 LINK hello_world 00:02:40.927 LINK hotplug 00:02:40.927 LINK iscsi_fuzz 00:02:40.927 LINK arbitration 00:02:40.927 LINK reconnect 00:02:40.927 LINK abort 00:02:41.185 LINK nvme_manage 00:02:41.185 LINK hello_blob 00:02:41.185 LINK accel_perf 00:02:41.185 CC test/bdev/bdevio/bdevio.o 00:02:41.443 LINK blobcli 00:02:41.443 LINK cuse 00:02:41.702 LINK bdevio 00:02:41.702 CC examples/bdev/hello_world/hello_bdev.o 00:02:41.702 CC examples/bdev/bdevperf/bdevperf.o 00:02:41.959 LINK hello_bdev 00:02:42.216 LINK bdevperf 00:02:42.783 CC examples/nvmf/nvmf/nvmf.o 00:02:43.041 LINK nvmf 00:02:43.977 LINK esnap 00:02:44.235 00:02:44.235 real 1m7.550s 00:02:44.235 user 14m16.402s 00:02:44.235 sys 4m5.765s 00:02:44.235 18:39:29 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:44.235 18:39:29 make -- common/autotest_common.sh@10 -- $ set +x 00:02:44.235 ************************************ 00:02:44.235 END TEST make 00:02:44.235 ************************************ 00:02:44.235 18:39:29 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:44.235 18:39:29 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:44.235 18:39:29 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:44.235 18:39:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.235 18:39:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:44.235 18:39:29 -- pm/common@44 -- $ pid=1883019 00:02:44.235 18:39:29 -- pm/common@50 -- $ kill -TERM 1883019 00:02:44.235 18:39:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.235 18:39:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:44.235 18:39:29 -- pm/common@44 -- $ pid=1883021 00:02:44.235 18:39:29 -- pm/common@50 -- $ kill -TERM 1883021 00:02:44.235 18:39:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.235 18:39:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:44.235 18:39:29 -- pm/common@44 -- $ pid=1883023 00:02:44.235 18:39:29 -- pm/common@50 -- $ kill -TERM 1883023 00:02:44.235 18:39:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.235 18:39:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:44.235 18:39:29 -- pm/common@44 -- $ pid=1883048 00:02:44.235 18:39:29 -- pm/common@50 -- $ sudo -E kill -TERM 1883048 
00:02:44.235 18:39:29 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:44.235 18:39:29 -- nvmf/common.sh@7 -- # uname -s 00:02:44.235 18:39:29 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:44.235 18:39:29 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:44.235 18:39:29 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:44.235 18:39:29 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:44.235 18:39:29 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:44.235 18:39:29 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:44.235 18:39:29 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:44.235 18:39:29 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:44.235 18:39:29 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:44.235 18:39:29 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:44.235 18:39:29 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:02:44.235 18:39:29 -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:02:44.235 18:39:29 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:44.235 18:39:29 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:44.235 18:39:29 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:44.235 18:39:29 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:44.235 18:39:29 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:44.235 18:39:29 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:44.235 18:39:29 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:44.235 18:39:29 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:44.235 18:39:29 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:44.235 18:39:29 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:44.235 18:39:29 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:44.235 18:39:29 -- paths/export.sh@5 -- # export PATH 00:02:44.235 18:39:29 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:44.235 18:39:29 -- nvmf/common.sh@47 -- # : 0 00:02:44.235 18:39:29 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:44.235 18:39:29 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:44.235 18:39:29 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:44.235 18:39:29 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:44.235 18:39:29 -- 
nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:44.235 18:39:29 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:44.235 18:39:29 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:44.235 18:39:29 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:44.235 18:39:29 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:44.235 18:39:29 -- spdk/autotest.sh@32 -- # uname -s 00:02:44.235 18:39:29 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:44.235 18:39:29 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:44.235 18:39:29 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:44.525 18:39:29 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:44.525 18:39:29 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:44.525 18:39:29 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:44.525 18:39:29 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:44.525 18:39:29 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:44.525 18:39:29 -- spdk/autotest.sh@48 -- # udevadm_pid=1949524 00:02:44.525 18:39:29 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:44.525 18:39:29 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:44.525 18:39:29 -- pm/common@17 -- # local monitor 00:02:44.525 18:39:29 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.525 18:39:29 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.525 18:39:29 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.525 18:39:29 -- pm/common@21 -- # date +%s 00:02:44.525 18:39:29 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:44.525 18:39:29 -- pm/common@21 -- # date +%s 00:02:44.525 18:39:29 -- pm/common@25 -- # sleep 1 00:02:44.525 18:39:29 -- pm/common@21 -- # date +%s 00:02:44.525 18:39:29 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721839169 00:02:44.525 18:39:29 -- pm/common@21 -- # date +%s 00:02:44.525 18:39:29 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721839169 00:02:44.525 18:39:29 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721839169 00:02:44.525 18:39:29 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721839169 00:02:44.525 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721839169_collect-vmstat.pm.log 00:02:44.525 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721839169_collect-cpu-load.pm.log 00:02:44.525 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721839169_collect-cpu-temp.pm.log 00:02:44.525 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721839169_collect-bmc-pm.bmc.pm.log 
00:02:45.461 18:39:30 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:45.461 18:39:30 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:45.461 18:39:30 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:45.461 18:39:30 -- common/autotest_common.sh@10 -- # set +x 00:02:45.461 18:39:30 -- spdk/autotest.sh@59 -- # create_test_list 00:02:45.461 18:39:30 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:45.461 18:39:30 -- common/autotest_common.sh@10 -- # set +x 00:02:45.461 18:39:30 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:45.461 18:39:30 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:45.461 18:39:30 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:45.461 18:39:30 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:45.461 18:39:30 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:45.461 18:39:30 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:45.461 18:39:30 -- common/autotest_common.sh@1453 -- # uname 00:02:45.461 18:39:30 -- common/autotest_common.sh@1453 -- # '[' Linux = FreeBSD ']' 00:02:45.461 18:39:30 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:45.461 18:39:30 -- common/autotest_common.sh@1473 -- # uname 00:02:45.461 18:39:30 -- common/autotest_common.sh@1473 -- # [[ Linux = FreeBSD ]] 00:02:45.461 18:39:30 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:45.461 18:39:30 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:45.461 18:39:30 -- spdk/autotest.sh@72 -- # hash lcov 00:02:45.461 18:39:30 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:45.461 18:39:30 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:45.461 --rc lcov_branch_coverage=1 00:02:45.461 --rc lcov_function_coverage=1 00:02:45.461 --rc genhtml_branch_coverage=1 00:02:45.461 --rc genhtml_function_coverage=1 00:02:45.461 --rc genhtml_legend=1 00:02:45.461 --rc geninfo_all_blocks=1 00:02:45.461 ' 00:02:45.461 18:39:30 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:45.461 --rc lcov_branch_coverage=1 00:02:45.461 --rc lcov_function_coverage=1 00:02:45.461 --rc genhtml_branch_coverage=1 00:02:45.461 --rc genhtml_function_coverage=1 00:02:45.461 --rc genhtml_legend=1 00:02:45.461 --rc geninfo_all_blocks=1 00:02:45.461 ' 00:02:45.461 18:39:30 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:45.461 --rc lcov_branch_coverage=1 00:02:45.461 --rc lcov_function_coverage=1 00:02:45.461 --rc genhtml_branch_coverage=1 00:02:45.461 --rc genhtml_function_coverage=1 00:02:45.461 --rc genhtml_legend=1 00:02:45.461 --rc geninfo_all_blocks=1 00:02:45.461 --no-external' 00:02:45.461 18:39:30 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:45.461 --rc lcov_branch_coverage=1 00:02:45.461 --rc lcov_function_coverage=1 00:02:45.461 --rc genhtml_branch_coverage=1 00:02:45.461 --rc genhtml_function_coverage=1 00:02:45.461 --rc genhtml_legend=1 00:02:45.461 --rc geninfo_all_blocks=1 00:02:45.461 --no-external' 00:02:45.461 18:39:30 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:45.461 lcov: LCOV version 1.14 00:02:45.461 18:39:30 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc 
genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:57.662 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:57.662 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 
00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:05.832 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:05.832 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:05.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:05.833 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:05.833 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:05.833 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:05.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:06.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:06.092 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:09.377 18:39:54 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:09.377 18:39:54 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:09.377 18:39:54 -- common/autotest_common.sh@10 -- # set +x 00:03:09.377 18:39:54 -- spdk/autotest.sh@91 -- # rm -f 00:03:09.377 18:39:54 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:12.667 0000:5f:00.0 (1b96 2600): Already using the nvme driver 00:03:12.667 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:03:12.667 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:12.667 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:12.667 18:39:57 -- 
spdk/autotest.sh@96 -- # get_zoned_devs 00:03:12.667 18:39:57 -- common/autotest_common.sh@1667 -- # zoned_devs=() 00:03:12.667 18:39:57 -- common/autotest_common.sh@1667 -- # local -gA zoned_devs 00:03:12.667 18:39:57 -- common/autotest_common.sh@1668 -- # local nvme bdf 00:03:12.667 18:39:57 -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:12.667 18:39:57 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:03:12.667 18:39:57 -- common/autotest_common.sh@1660 -- # local device=nvme0n1 00:03:12.667 18:39:57 -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:12.667 18:39:57 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n1 00:03:12.667 18:39:57 -- common/autotest_common.sh@1660 -- # local device=nvme1n1 00:03:12.667 18:39:57 -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:12.667 18:39:57 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n2 00:03:12.667 18:39:57 -- common/autotest_common.sh@1660 -- # local device=nvme1n2 00:03:12.667 18:39:57 -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1663 -- # [[ host-managed != none ]] 00:03:12.667 18:39:57 -- common/autotest_common.sh@1672 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:12.667 18:39:57 -- spdk/autotest.sh@98 -- # (( 1 > 0 )) 00:03:12.667 18:39:57 -- spdk/autotest.sh@103 -- # export PCI_BLOCKED=0000:5f:00.0 00:03:12.667 18:39:57 -- spdk/autotest.sh@103 -- # PCI_BLOCKED=0000:5f:00.0 00:03:12.667 18:39:57 -- spdk/autotest.sh@104 -- # export PCI_ZONED=0000:5f:00.0 00:03:12.667 18:39:57 -- spdk/autotest.sh@104 -- # PCI_ZONED=0000:5f:00.0 00:03:12.667 18:39:57 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:12.667 18:39:57 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:12.667 18:39:57 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:12.667 18:39:57 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:12.667 18:39:57 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:12.667 No valid GPT data, bailing 00:03:12.667 18:39:57 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:12.667 18:39:57 -- scripts/common.sh@391 -- # pt= 00:03:12.667 18:39:57 -- scripts/common.sh@392 -- # return 1 00:03:12.667 18:39:57 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:12.667 1+0 records in 00:03:12.667 1+0 records out 00:03:12.667 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00140726 s, 745 MB/s 00:03:12.667 18:39:57 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:12.667 18:39:57 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:12.667 18:39:57 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme1n1 00:03:12.667 18:39:57 -- scripts/common.sh@378 -- # local block=/dev/nvme1n1 pt 00:03:12.667 18:39:57 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:03:12.667 No valid GPT data, bailing 00:03:12.667 18:39:57 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:03:12.667 
18:39:57 -- scripts/common.sh@391 -- # pt= 00:03:12.667 18:39:57 -- scripts/common.sh@392 -- # return 1 00:03:12.667 18:39:57 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:03:12.667 1+0 records in 00:03:12.667 1+0 records out 00:03:12.667 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00556502 s, 188 MB/s 00:03:12.667 18:39:57 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:12.667 18:39:57 -- spdk/autotest.sh@112 -- # [[ -z 0000:5f:00.0 ]] 00:03:12.667 18:39:57 -- spdk/autotest.sh@112 -- # continue 00:03:12.667 18:39:57 -- spdk/autotest.sh@118 -- # sync 00:03:12.667 18:39:57 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:12.667 18:39:57 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:12.667 18:39:57 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:17.944 18:40:02 -- spdk/autotest.sh@124 -- # uname -s 00:03:17.944 18:40:02 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:17.944 18:40:02 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:17.944 18:40:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:17.944 18:40:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:17.944 18:40:02 -- common/autotest_common.sh@10 -- # set +x 00:03:17.944 ************************************ 00:03:17.944 START TEST setup.sh 00:03:17.944 ************************************ 00:03:17.944 18:40:02 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:17.944 * Looking for test storage... 00:03:17.944 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:17.944 18:40:02 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:17.944 18:40:02 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:17.944 18:40:02 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:17.944 18:40:02 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:17.944 18:40:02 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:17.944 18:40:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:17.944 ************************************ 00:03:17.944 START TEST acl 00:03:17.944 ************************************ 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:17.944 * Looking for test storage... 
00:03:17.944 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:17.944 18:40:02 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1667 -- # zoned_devs=() 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1667 -- # local -gA zoned_devs 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1668 -- # local nvme bdf 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1660 -- # local device=nvme0n1 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:17.944 18:40:02 setup.sh.acl -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n1 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1660 -- # local device=nvme1n1 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n2 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1660 -- # local device=nvme1n2 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ host-managed != none ]] 00:03:17.945 18:40:02 setup.sh.acl -- common/autotest_common.sh@1672 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:17.945 18:40:02 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:17.945 18:40:02 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:17.945 18:40:02 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:17.945 18:40:02 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:17.945 18:40:02 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:17.945 18:40:02 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:17.945 18:40:02 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:21.236 18:40:05 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:21.236 18:40:05 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:21.236 18:40:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:21.236 18:40:05 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:21.236 18:40:05 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:21.236 18:40:05 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:24.528 Hugepages 00:03:24.528 node hugesize free / total 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
2048kB == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 00:03:24.528 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- 
setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5f:00.0 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@21 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:24.528 18:40:09 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:24.528 18:40:09 setup.sh.acl -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:24.528 18:40:09 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:24.528 18:40:09 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:24.528 ************************************ 00:03:24.528 START TEST denied 00:03:24.528 ************************************ 00:03:24.528 18:40:09 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:24.528 18:40:09 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED='0000:5f:00.0 0000:5e:00.0' 00:03:24.528 18:40:09 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:24.528 18:40:09 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:24.528 18:40:09 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:24.528 18:40:09 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:27.820 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:27.820 18:40:12 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:32.015 00:03:32.015 real 0m7.235s 00:03:32.015 user 0m2.264s 00:03:32.015 sys 0m4.043s 00:03:32.015 18:40:16 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:32.015 18:40:16 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:32.015 ************************************ 00:03:32.015 END TEST denied 00:03:32.015 ************************************ 00:03:32.015 18:40:16 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:32.015 18:40:16 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:32.015 18:40:16 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:32.015 18:40:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:32.015 ************************************ 00:03:32.015 START TEST allowed 00:03:32.015 ************************************ 00:03:32.015 18:40:16 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:03:32.015 18:40:16 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:32.015 18:40:16 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:32.015 18:40:16 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:32.015 18:40:16 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:32.015 18:40:16 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:36.248 0000:5e:00.0 (8086 
0a54): nvme -> vfio-pci 00:03:36.248 18:40:20 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:36.248 18:40:20 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:36.248 18:40:20 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:36.248 18:40:20 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:36.248 18:40:20 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:39.534 00:03:39.534 real 0m7.630s 00:03:39.534 user 0m2.563s 00:03:39.534 sys 0m4.233s 00:03:39.534 18:40:24 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:39.534 18:40:24 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:39.534 ************************************ 00:03:39.534 END TEST allowed 00:03:39.534 ************************************ 00:03:39.534 00:03:39.534 real 0m22.015s 00:03:39.534 user 0m7.524s 00:03:39.534 sys 0m12.935s 00:03:39.534 18:40:24 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:39.534 18:40:24 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:39.534 ************************************ 00:03:39.534 END TEST acl 00:03:39.534 ************************************ 00:03:39.534 18:40:24 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:39.535 18:40:24 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:39.535 18:40:24 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:39.535 18:40:24 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:39.535 ************************************ 00:03:39.535 START TEST hugepages 00:03:39.535 ************************************ 00:03:39.535 18:40:24 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:39.794 * Looking for test storage... 
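The denied and allowed tests above both come down to one check: which kernel driver a PCI function is bound to after scripts/setup.sh runs with PCI_BLOCKED or PCI_ALLOWED set. A minimal sketch of that check, using only the sysfs paths visible in the trace (the check_bound_driver helper name is illustrative, not part of setup/acl.sh):

    check_bound_driver() {
        # Resolve the driver currently bound to a PCI function, the same way the verify step does
        local bdf=$1 expected=$2 link
        link=$(readlink -f "/sys/bus/pci/devices/${bdf}/driver") || return 1
        [[ ${link##*/} == "$expected" ]]
    }
    check_bound_driver 0000:5e:00.0 nvme      # holds during the denied test, while the controller is in PCI_BLOCKED
    check_bound_driver 0000:5e:00.0 vfio-pci  # holds after setup.sh config runs with PCI_ALLOWED=0000:5e:00.0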
00:03:39.794 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 75670776 kB' 'MemAvailable: 79088912 kB' 'Buffers: 2696 kB' 'Cached: 9749316 kB' 'SwapCached: 0 kB' 'Active: 6737428 kB' 'Inactive: 3507864 kB' 'Active(anon): 6340104 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 497152 kB' 'Mapped: 171176 kB' 'Shmem: 5846824 kB' 'KReclaimable: 218348 kB' 'Slab: 665500 kB' 'SReclaimable: 218348 kB' 'SUnreclaim: 447152 kB' 'KernelStack: 19712 kB' 'PageTables: 8552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52952956 kB' 'Committed_AS: 7712304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221192 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages 
-- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.794 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 
-- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.795 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:39.796 18:40:24 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:39.796 18:40:24 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:39.796 18:40:24 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:39.796 18:40:24 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:39.796 18:40:24 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:39.796 ************************************ 00:03:39.796 START TEST default_setup 00:03:39.796 ************************************ 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.796 18:40:24 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:42.331 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:42.331 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:42.331 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:42.331 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.2 (8086 2021): ioatdma -> 
vfio-pci 00:03:42.590 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:42.590 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:43.532 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77833008 kB' 'MemAvailable: 81250920 kB' 'Buffers: 2696 kB' 'Cached: 9749416 kB' 'SwapCached: 0 kB' 'Active: 6754040 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356716 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513048 kB' 'Mapped: 170980 kB' 'Shmem: 5846924 kB' 'KReclaimable: 217900 kB' 'Slab: 663336 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445436 kB' 'KernelStack: 19824 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7728340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221480 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 
'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.532 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
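The long run of continue entries here is setup/common.sh's get_meminfo scanning every /proc/meminfo field until it reaches the one it was asked for (AnonHugePages in this pass, Hugepagesize and HugePages_Surp in the neighbouring ones). A condensed sketch of that lookup, assuming only the parsing shown in the trace (grab_meminfo is an illustrative name, not the script's own):

    grab_meminfo() {
        # Print the value of a single /proc/meminfo field, e.g. grab_meminfo AnonHugePages
        local want=$1 var val _
        while IFS=': ' read -r var val _; do
            # Every field that is not the requested one is skipped, hence the continue lines in the trace
            [[ $var == "$want" ]] && { printf '%s\n' "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }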
00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.533 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n 
'' ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77834640 kB' 'MemAvailable: 81252552 kB' 'Buffers: 2696 kB' 'Cached: 9749416 kB' 'SwapCached: 0 kB' 'Active: 6754696 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357372 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513788 kB' 'Mapped: 170980 kB' 'Shmem: 5846924 kB' 'KReclaimable: 217900 kB' 'Slab: 663372 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445472 kB' 'KernelStack: 19984 kB' 'PageTables: 9016 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7728356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221576 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
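The full /proc/meminfo snapshot printed above already carries the hugepage bookkeeping this test relies on: HugePages_Total: 1024, Hugepagesize: 2048 kB and Hugetlb: 2097152 kB. Those figures are mutually consistent (1024 pages x 2048 kB = 2097152 kB, i.e. 2 GiB of reserved hugepage memory). The few shell lines below are a hedged convenience check for redoing that arithmetic on a test node; they are not part of setup/common.sh.

    # Manual cross-check of the snapshot above (not part of the SPDK scripts).
    awk '/^(HugePages_Total|Hugepagesize|Hugetlb):/' /proc/meminfo
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    size_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)
    echo "expected Hugetlb: $(( total * size_kb )) kB"   # 1024 * 2048 = 2097152 kB
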
00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.534 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
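The xtrace entries around this point all come from one meminfo lookup helper in setup/common.sh: it selects /proc/meminfo (or a per-NUMA-node meminfo file when a node argument is supplied), reads the file into an array with mapfile, strips any leading "Node N " prefix, and then scans the lines with IFS=': ' until it finds the requested key, echoing its value. The sketch below reconstructs that pattern from the trace; the variable names follow the trace, but it is an illustrative reconstruction rather than the verbatim script.

    # Reconstructed from the xtrace above; illustrative, not the verbatim
    # setup/common.sh. extglob is needed for the "Node +([0-9]) " prefix strip.
    shopt -s extglob
    get_meminfo() {
        local get=$1
        local node=${2:-}
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Prefer the per-node meminfo file when a NUMA node was requested.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; drop that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

In the trace the helper is called as get_meminfo AnonHugePages, get_meminfo HugePages_Surp, get_meminfo HugePages_Rsvd and get_meminfo HugePages_Total; each non-matching key shows up as one of the continue entries logged above, and the match ends the scan with the echo/return pair (echo 0 for AnonHugePages, HugePages_Surp and HugePages_Rsvd in this run).
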
00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.535 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77834900 kB' 'MemAvailable: 81252812 kB' 'Buffers: 2696 kB' 'Cached: 9749436 kB' 'SwapCached: 0 kB' 'Active: 6754644 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357320 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 
kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513588 kB' 'Mapped: 170980 kB' 'Shmem: 5846944 kB' 'KReclaimable: 217900 kB' 'Slab: 663372 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445472 kB' 'KernelStack: 20064 kB' 'PageTables: 9240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7727128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221528 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.536 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.537 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:43.538 nr_hugepages=1024 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:43.538 resv_hugepages=0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:43.538 surplus_hugepages=0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:43.538 anon_hugepages=0 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77835636 kB' 'MemAvailable: 81253548 kB' 'Buffers: 2696 kB' 'Cached: 9749456 kB' 'SwapCached: 0 kB' 'Active: 6754484 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357160 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513460 kB' 'Mapped: 170980 kB' 'Shmem: 5846964 kB' 'KReclaimable: 217900 kB' 'Slab: 663212 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445312 kB' 'KernelStack: 19968 kB' 
'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7728400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221592 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.538 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
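By this point hugepages.sh has collected anon=0, surp=0 and resv=0, echoed the summary values (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0), and is re-reading HugePages_Total to confirm the kernel really exposes the expected 1024 default pages. Below is a compact, hedged sketch of that bookkeeping, reusing the helper reconstructed earlier; the exact conditionals in hugepages.sh may differ.

    # Illustrative only; mirrors the hugepages.sh accounting visible in the trace.
    nr_hugepages=1024                        # default_setup target pool size
    anon=$(get_meminfo AnonHugePages)        # 0 in this run (AnonHugePages: 0 kB)
    surp=$(get_meminfo HugePages_Surp)       # 0
    resv=$(get_meminfo HugePages_Rsvd)       # 0
    echo "nr_hugepages=$nr_hugepages" "resv_hugepages=$resv" "surplus_hugepages=$surp" "anon_hugepages=$anon"
    # The configured pool must account for surplus and reserved pages.
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2
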
00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.539 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # 
for node in /sys/devices/system/node/node+([0-9]) 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27268216 kB' 'MemUsed: 5366412 kB' 'SwapCached: 0 kB' 'Active: 2026824 kB' 'Inactive: 100316 kB' 'Active(anon): 1833828 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902312 kB' 'Mapped: 126308 kB' 'AnonPages: 227732 kB' 'Shmem: 1609000 kB' 'KernelStack: 11000 kB' 'PageTables: 5792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90220 kB' 'Slab: 318656 kB' 'SReclaimable: 90220 kB' 'SUnreclaim: 228436 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:43.540 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.541 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.541 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:43.801 18:40:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:43.801 node0=1024 expecting 1024 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:43.801 00:03:43.801 real 0m3.886s 00:03:43.801 user 0m1.178s 00:03:43.801 sys 0m1.817s 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:43.801 18:40:28 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:43.801 ************************************ 00:03:43.801 END TEST default_setup 00:03:43.801 ************************************ 00:03:43.801 18:40:28 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:43.801 18:40:28 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:43.801 18:40:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:43.801 18:40:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:43.801 ************************************ 00:03:43.801 START TEST per_node_1G_alloc 00:03:43.801 ************************************ 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:43.801 18:40:28 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:43.801 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:43.802 18:40:28 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:46.337 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:46.337 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.337 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:46.337 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.599 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.599 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.599 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.599 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.599 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.600 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:46.600 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77864352 kB' 'MemAvailable: 81282280 kB' 'Buffers: 2696 kB' 'Cached: 9749560 kB' 'SwapCached: 0 kB' 'Active: 6755008 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357684 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513912 kB' 'Mapped: 171032 kB' 'Shmem: 5847068 kB' 'KReclaimable: 217932 kB' 'Slab: 663644 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445712 kB' 'KernelStack: 19696 kB' 'PageTables: 8200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7726252 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.600 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 
18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.601 
18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.601 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77865136 kB' 'MemAvailable: 81283064 kB' 'Buffers: 2696 kB' 'Cached: 9749560 kB' 'SwapCached: 0 kB' 'Active: 6754940 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357616 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513888 kB' 'Mapped: 170992 kB' 'Shmem: 5847068 kB' 'KReclaimable: 217932 kB' 'Slab: 663612 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445680 kB' 'KernelStack: 19664 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7726268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221336 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.602 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.603 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.603 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77866040 kB' 'MemAvailable: 81283968 kB' 'Buffers: 2696 kB' 'Cached: 9749580 kB' 'SwapCached: 0 kB' 'Active: 6754660 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357336 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513576 kB' 'Mapped: 170992 kB' 'Shmem: 5847088 kB' 'KReclaimable: 217932 kB' 'Slab: 663676 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445744 kB' 'KernelStack: 19664 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7726292 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.604 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.605 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:46.606 nr_hugepages=1024 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:46.606 resv_hugepages=0 00:03:46.606 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:46.606 surplus_hugepages=0 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:46.868 anon_hugepages=0 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:46.868 18:40:31 
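The trace up to this point boils down to one consistency check: hugepages.sh has looked up AnonHugePages, HugePages_Surp and HugePages_Rsvd in /proc/meminfo (all reported as 0 on this node) and only proceeds with the per_node_1G_alloc test when the configured count of 1024 pages agrees with the kernel's accounting. Below is a minimal sketch of that check, using the variable names echoed in the trace (anon, surp, resv, nr_hugepages); it is an assumed re-creation for illustration, not the actual setup/hugepages.sh code.

    # Sketch of the accounting check traced at hugepages.sh@97-@109 above;
    # the values are the ones reported in this run, not hard requirements.
    anon=0              # AnonHugePages (kB) read from /proc/meminfo
    surp=0              # HugePages_Surp
    resv=0              # HugePages_Rsvd
    nr_hugepages=1024   # requested number of 2048 kB huge pages

    (( 1024 == nr_hugepages + surp + resv )) || echo "unexpected hugepage accounting" >&2
    (( 1024 == nr_hugepages ))               || echo "hugepage total does not match request" >&2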
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77866436 kB' 'MemAvailable: 81284364 kB' 'Buffers: 2696 kB' 'Cached: 9749600 kB' 'SwapCached: 0 kB' 'Active: 6754676 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357352 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513580 kB' 'Mapped: 170992 kB' 'Shmem: 5847108 kB' 'KReclaimable: 217932 kB' 'Slab: 663676 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445744 kB' 'KernelStack: 19664 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7726316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.868 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.869 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.869 18:40:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # (xtrace repeats the IFS=': ' / read -r var val _ / continue cycle for each remaining meminfo field until HugePages_Total is reached)
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
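The trace above shows how setup/common.sh looks a single key up in meminfo: it walks every line with IFS=': ' and read -r var val _, hits continue for every field that is not the requested one, and finally echoes the value (1024 for HugePages_Total here). A minimal standalone sketch of that lookup, assuming a helper name get_meminfo_value that is not part of the SPDK scripts:

  get_meminfo_value() {
      local get=$1 mem_f=${2:-/proc/meminfo} var val _
      while IFS=': ' read -r var val _; do
          # every non-matching field falls through, mirroring the continue lines in the trace
          [[ $var == "$get" ]] || continue
          echo "$val"
          return 0
      done <"$mem_f"
      return 1
  }
  # get_meminfo_value HugePages_Total   -> prints 1024 on this runner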
00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28338356 kB' 'MemUsed: 4296272 kB' 'SwapCached: 0 kB' 'Active: 2026060 kB' 'Inactive: 100316 kB' 'Active(anon): 1833064 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902368 kB' 'Mapped: 126320 kB' 'AnonPages: 227176 kB' 'Shmem: 1609056 kB' 'KernelStack: 10616 kB' 'PageTables: 4624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90252 kB' 'Slab: 319280 kB' 'SReclaimable: 90252 kB' 'SUnreclaim: 229028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.870 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.871 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.871 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
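For the per-node lookup that starts here, the trace switches mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, reads it with mapfile, and strips the leading "Node 0 " prefix with an extglob parameter expansion before scanning the fields. A short illustrative sketch of just that step, assuming a NUMA system where node0 exists:

  shopt -s extglob
  node=0
  mem_f=/sys/devices/system/node/node${node}/meminfo
  mapfile -t mem <"$mem_f"
  # "Node 0 HugePages_Surp:      0"  becomes  "HugePages_Surp:      0"
  mem=("${mem[@]#Node +([0-9]) }")
  printf '%s\n' "${mem[@]}" | grep HugePages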
00:03:46.871 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # (xtrace repeats the IFS=': ' / read -r var val _ / continue cycle for each remaining node0 meminfo field until HugePages_Surp is reached)
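The hugepages.sh@115-117 lines in this trace then accumulate, for each node, the reserved count plus that node's HugePages_Surp before the totals are compared against the expected 512. A rough sketch of that accumulation (not the verbatim SPDK loop; awk stands in for the get_meminfo call, and the starting nodes_test values are assumed to come from the earlier allocation step):

  resv=0
  nodes_test=(512 512)    # assumed per-node allocations requested earlier in the test
  for node in 0 1; do
      # per-node meminfo lines read "Node 0 HugePages_Surp: 0", so the value is field 4
      surp=$(awk '$3 == "HugePages_Surp:" {print $4}' "/sys/devices/system/node/node${node}/meminfo")
      (( nodes_test[node] += resv + surp ))
  done
  echo "node0=${nodes_test[0]} node1=${nodes_test[1]} (expecting 512 each)"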
00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49528080 kB' 'MemUsed: 11160304 kB' 'SwapCached: 0 kB' 'Active: 4728616 kB' 'Inactive: 3407548 kB' 'Active(anon): 4524288 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7849928 kB' 'Mapped: 44672 kB' 'AnonPages: 286404 kB' 'Shmem: 4238052 kB' 'KernelStack: 9048 kB' 'PageTables: 3500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127680 kB' 'Slab: 344396 kB' 'SReclaimable: 127680 kB' 'SUnreclaim: 216716 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.872 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.872 
18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # (xtrace repeats the IFS=': ' / read -r var val _ / continue cycle for each remaining node1 meminfo field until the HugePages_* entries are reached) 00:03:46.873 18:40:31
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:46.873 node0=512 expecting 512 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:46.873 node1=512 expecting 512 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:46.873 00:03:46.873 real 0m3.047s 00:03:46.873 user 0m1.218s 00:03:46.873 sys 0m1.850s 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:46.873 18:40:31 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:46.873 ************************************ 00:03:46.873 END TEST per_node_1G_alloc 00:03:46.873 ************************************ 00:03:46.873 18:40:31 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:46.873 18:40:31 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:46.873 18:40:31 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.873 18:40:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:46.873 ************************************ 00:03:46.873 START TEST even_2G_alloc 00:03:46.873 ************************************ 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:03:46.874 18:40:31 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.874 18:40:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:49.408 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:49.669 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:49.669 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:00:04.1 (8086 2021): Already using the 
vfio-pci driver 00:03:49.669 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:49.669 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:49.669 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77877552 kB' 'MemAvailable: 81295480 kB' 'Buffers: 2696 kB' 'Cached: 9749720 kB' 'SwapCached: 0 kB' 'Active: 6754624 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357300 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512244 kB' 'Mapped: 169840 kB' 'Shmem: 5847228 kB' 'KReclaimable: 217932 kB' 'Slab: 663248 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445316 kB' 'KernelStack: 19632 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 
54001532 kB' 'Committed_AS: 7720492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221336 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
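The repeated "[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" pairs in this trace come from the get_meminfo helper in setup/common.sh, which scans /proc/meminfo (or a per-node meminfo file) one field at a time until it reaches the requested key and then echoes its value. The following is only a sketch reconstructed from the commands visible in the xtrace above; the variable and function names follow the trace, but the real script's control flow may differ.

    # Sketch only -- reconstructed from the xtrace above, not the authoritative setup/common.sh.
    shopt -s extglob    # needed for the "+([0-9])" pattern used to strip per-node prefixes

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem
        # Prefer the per-node meminfo file when a NUMA node was requested and it exists.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # not the field we were asked for
            echo "$val"
            return 0
        done
        return 1
    }

In this run the AnonHugePages lookup, and the HugePages_Surp and HugePages_Rsvd lookups that follow it, all resolve to 0, which is why the common.sh@33 echo/return pairs below report 0 and verify_nr_hugepages proceeds with anon=0, surp=0 and resv=0 before checking the 1024 allocated hugepages.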
00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.670 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 
-- # local var val 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77878180 kB' 'MemAvailable: 81296108 kB' 'Buffers: 2696 kB' 'Cached: 9749720 kB' 'SwapCached: 0 kB' 'Active: 6753780 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356456 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512556 kB' 'Mapped: 169840 kB' 'Shmem: 5847228 kB' 'KReclaimable: 217932 kB' 'Slab: 663252 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445320 kB' 'KernelStack: 19632 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7720508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.671 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.672 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.936 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77878208 kB' 'MemAvailable: 81296136 kB' 'Buffers: 2696 kB' 'Cached: 9749740 kB' 'SwapCached: 0 kB' 'Active: 6753816 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356492 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512568 kB' 'Mapped: 169840 kB' 'Shmem: 5847248 kB' 'KReclaimable: 217932 kB' 'Slab: 663252 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445320 kB' 'KernelStack: 19632 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7720532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.937 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.938 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:49.939 nr_hugepages=1024 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:49.939 resv_hugepages=0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:49.939 surplus_hugepages=0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:49.939 anon_hugepages=0 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77878896 kB' 'MemAvailable: 81296824 kB' 'Buffers: 2696 kB' 'Cached: 9749760 kB' 'SwapCached: 0 kB' 'Active: 6753828 kB' 
'Inactive: 3507864 kB' 'Active(anon): 6356504 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512564 kB' 'Mapped: 169840 kB' 'Shmem: 5847268 kB' 'KReclaimable: 217932 kB' 'Slab: 663252 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 445320 kB' 'KernelStack: 19632 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7720552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.939 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.940 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:49.941 
18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.941 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28342284 kB' 'MemUsed: 4292344 kB' 'SwapCached: 0 kB' 'Active: 2024768 kB' 'Inactive: 100316 kB' 'Active(anon): 1831772 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902500 kB' 'Mapped: 125768 kB' 'AnonPages: 225720 kB' 'Shmem: 1609188 kB' 'KernelStack: 10568 kB' 'PageTables: 4460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90252 kB' 'Slab: 319220 kB' 'SReclaimable: 90252 kB' 'SUnreclaim: 228968 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:49.942 18:40:34 
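The trace above shows `get_meminfo HugePages_Total` from `setup/common.sh` walking `/proc/meminfo` (or, when a node index is supplied, `/sys/devices/system/node/nodeN/meminfo`) with `IFS=': '` and `read -r var val _` until the requested key matches, then echoing the value (1024 here) and returning. Below is a minimal, hedged sketch of that lookup pattern; the function name and behaviour mirror what the trace suggests, but it is a simplified illustration, not the actual SPDK helper.

```bash
#!/usr/bin/env bash
# Simplified get_meminfo-style lookup (illustrative sketch only).
# Usage: get_meminfo KEY [NODE]  -> prints the value recorded for KEY.
get_meminfo() {
    local get=$1 node=${2:-} line var val _
    local mem_f=/proc/meminfo
    # Per-node counters live in sysfs when a node index is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
        # sysfs prefixes every line with "Node <n> "; strip it first.
        line=${line#"Node $node "}
        # Split "Key: value kB" on ": " and stop at the requested key.
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}

# Example: system-wide total and node-0 free 2 MiB hugepages.
echo "HugePages_Total: $(get_meminfo HugePages_Total)"
echo "node0 HugePages_Free: $(get_meminfo HugePages_Free 0)"
```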
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.942 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49536856 kB' 'MemUsed: 11151528 kB' 'SwapCached: 0 kB' 'Active: 4729060 kB' 'Inactive: 3407548 kB' 'Active(anon): 4524732 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7849956 kB' 'Mapped: 44072 kB' 'AnonPages: 286844 kB' 'Shmem: 4238080 kB' 'KernelStack: 9064 kB' 'PageTables: 3504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127680 kB' 'Slab: 344032 kB' 'SReclaimable: 127680 kB' 'SUnreclaim: 216352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:49.943 18:40:34 
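At this point `setup/hugepages.sh` has already recorded, via `get_nodes`, that the system has two NUMA nodes (`no_nodes=2`) with 512 pages expected on each, and it is now reading `HugePages_Surp` for node 0 and node 1 to confirm there are no surplus pages. A rough self-contained sketch of that even-split accounting follows; it uses a hypothetical helper `node_hp` to pull one field out of the per-node meminfo file and is only an illustration of what the trace performs, not the test script itself.

```bash
#!/usr/bin/env bash
# Sketch: verify reserved hugepages are split evenly across NUMA nodes
# (node0=512, node1=512 when nr_hugepages=1024 on a two-node machine).
set -euo pipefail

nr_hugepages=1024

# node_hp NODE KEY -> value of the "Node <n> <KEY>: <value>" line in sysfs.
node_hp() {
    awk -v key="$2:" '$3 == key { print $4 }' "/sys/devices/system/node/node$1/meminfo"
}

nodes=(/sys/devices/system/node/node[0-9]*)
per_node=$((nr_hugepages / ${#nodes[@]}))

for dir in "${nodes[@]}"; do
    n=${dir##*node}
    total=$(node_hp "$n" HugePages_Total)
    surp=$(node_hp "$n" HugePages_Surp)
    echo "node$n=$total expecting $per_node"
    (( total == per_node && surp == 0 )) || { echo "unexpected split on node$n" >&2; exit 1; }
done
```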
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.943 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:49.944 node0=512 expecting 512 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:49.944 node1=512 expecting 512 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:49.944 00:03:49.944 real 0m3.072s 00:03:49.944 user 0m1.243s 00:03:49.944 sys 0m1.851s 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:49.944 18:40:34 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:49.944 ************************************ 00:03:49.944 END TEST even_2G_alloc 00:03:49.944 ************************************ 00:03:49.944 18:40:34 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:49.944 18:40:34 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:49.944 18:40:34 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:49.944 18:40:34 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:49.944 ************************************ 00:03:49.944 START TEST odd_alloc 00:03:49.944 ************************************ 00:03:49.944 18:40:34 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:03:49.944 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:49.944 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:49.944 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # 
(( 1 > 1 )) 00:03:49.944 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:49.945 18:40:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:52.502 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:52.502 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:52.502 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:52.502 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:52.502 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:52.502 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 
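For reference, the nodes_test values assigned in the odd_alloc trace above (1025 hugepages over 2 NUMA nodes, ending up as node0=513 and node1=512) follow from a simple back-to-front split. The sketch below reproduces that arithmetic on its own; distribute_nodes is an illustrative name for this note, not a function in hugepages.sh, and the test itself goes on to export HUGEMEM=2049 and HUGE_EVEN_ALLOC=yes before invoking scripts/setup.sh, as the trace shows.

  #!/usr/bin/env bash
  # Sketch of the per-node hugepage split seen in the odd_alloc trace above.
  # distribute_nodes is illustrative only; hugepages.sh performs this inline.
  distribute_nodes() {
      local _nr_hugepages=$1 _no_nodes=$2
      local -a nodes_test=()
      # Give each node an equal share of the pages still unassigned, walking
      # from the highest node index down; integer division leaves the odd
      # remainder for node 0.
      while (( _no_nodes > 0 )); do
          nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
          : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))
          : $(( _no_nodes-- ))
      done
      declare -p nodes_test
  }
  # 1025 pages over 2 nodes -> nodes_test=([0]="513" [1]="512"),
  # matching the 512 and 513 assignments in the trace.
  distribute_nodes 1025 2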
00:03:52.765 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:52.765 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77874048 kB' 'MemAvailable: 81291976 kB' 'Buffers: 2696 kB' 'Cached: 9749876 kB' 'SwapCached: 0 kB' 'Active: 6754244 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356920 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512868 kB' 'Mapped: 169888 kB' 'Shmem: 5847384 kB' 'KReclaimable: 217932 kB' 'Slab: 662616 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 444684 kB' 'KernelStack: 19632 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7721024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221288 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 
'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.765 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.766 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 
18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77876512 kB' 'MemAvailable: 81294440 kB' 'Buffers: 2696 kB' 'Cached: 9749876 kB' 'SwapCached: 0 kB' 'Active: 6754648 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357324 kB' 
'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513208 kB' 'Mapped: 169860 kB' 'Shmem: 5847384 kB' 'KReclaimable: 217932 kB' 'Slab: 662600 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 444668 kB' 'KernelStack: 19600 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7722532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221240 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.767 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.768 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 
18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77876244 kB' 'MemAvailable: 81294172 kB' 'Buffers: 2696 kB' 'Cached: 9749896 kB' 'SwapCached: 0 kB' 'Active: 6754940 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357616 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513424 kB' 'Mapped: 169860 kB' 'Shmem: 5847404 kB' 'KReclaimable: 217932 kB' 'Slab: 662624 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 444692 kB' 'KernelStack: 19744 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7723676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221416 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue
00:03:52.769 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-@32 -- # [per-field scan of /proc/meminfo: IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue, repeated for every field from MemAvailable through HugePages_Free]
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:52.771 nr_hugepages=1025
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:52.771 resv_hugepages=0
00:03:52.771 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:52.771 surplus_hugepages=0
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:53.033 anon_hugepages=0
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
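The trace above is the test/setup/common.sh get_meminfo helper scanning /proc/meminfo field by field until it reaches the requested key (HugePages_Rsvd here, value 0). A minimal bash sketch of that loop, reconstructed from the trace rather than taken from the upstream script, so exact line numbers and quoting may differ:

  # Sketch only: behaviour inferred from the xtrace; not the verbatim SPDK source.
  shopt -s extglob
  get_meminfo() {
      local get=$1 node=$2 var val
      local mem_f=/proc/meminfo mem
      # A per-node query (e.g. "get_meminfo HugePages_Surp 0") reads that node's own meminfo.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")      # per-node files prefix each line with "Node N "
      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue  # this is the long continue run seen in the log
          echo "$val"
          return 0
      done
      return 1
  }

Calling get_meminfo HugePages_Rsvd on this runner would print 0, matching the resv=0 assignment at setup/hugepages.sh@100.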
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17-@25 -- # [local get=HugePages_Total, node=, var, val; mem_f=/proc/meminfo, no per-node meminfo selected]
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.033 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.034 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77877560 kB' 'MemAvailable: 81295488 kB' 'Buffers: 2696 kB' 'Cached: 9749916 kB' 'SwapCached: 0 kB' 'Active: 6755240 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357916 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514252 kB' 'Mapped: 169860 kB' 'Shmem: 5847424 kB' 'KReclaimable: 217932 kB' 'Slab: 662624 kB' 'SReclaimable: 217932 kB' 'SUnreclaim: 444692 kB' 'KernelStack: 19648 kB' 'PageTables: 7780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7723700 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221400 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB'
00:03:53.034 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-@32 -- # [per-field scan of the snapshot above: every field from MemTotal through Unaccepted compared against HugePages_Total and skipped via continue]
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
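The value just returned (1025) feeds the assertion at setup/hugepages.sh@110 below; with the counters echoed earlier it is a plain arithmetic identity. A stand-alone recheck in shell, with the values copied from this run rather than read back from the script:

  # Hypothetical recheck mirroring the @107/@110 assertions; all numbers come from the trace above.
  nr_hugepages=1025 resv_hugepages=0 surplus_hugepages=0
  hugepages_total=1025   # HugePages_Total from the /proc/meminfo snapshot above
  (( hugepages_total == nr_hugepages + surplus_hugepages + resv_hugepages )) && echo "odd_alloc total consistent"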
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
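get_nodes, traced at setup/hugepages.sh@27-@33 above, records one entry per NUMA node keyed by the node id; on this runner it sees two nodes carrying 512 and 513 hugepages. A sketch of that walk follows; the nr_hugepages sysfs path is an assumption, since the trace only shows the resulting values:

  # Sketch of the get_nodes loop; the hugepage-count source path is assumed, not shown in the trace.
  shopt -s extglob
  declare -a nodes_sys=()
  for node in /sys/devices/system/node/node+([0-9]); do
      nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  no_nodes=${#nodes_sys[@]}      # 2 on this runner
  (( no_nodes > 0 ))             # same guard as setup/hugepages.sh@33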
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17-@24 -- # [local get=HugePages_Surp, node=0, var, val; mem_f=/sys/devices/system/node/node0/meminfo]
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.035 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.036 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28334908 kB' 'MemUsed: 4299720 kB' 'SwapCached: 0 kB' 'Active: 2026108 kB' 'Inactive: 100316 kB' 'Active(anon): 1833112 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902644 kB' 'Mapped: 125788 kB' 'AnonPages: 226892 kB' 'Shmem: 1609332 kB' 'KernelStack: 10616 kB' 'PageTables: 4376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90252 kB' 'Slab: 318704 kB' 'SReclaimable: 90252 kB' 'SUnreclaim: 228452 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:53.036 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-@32 -- # [per-field scan of node0 meminfo: every field from MemTotal through HugePages_Free compared against HugePages_Surp and skipped via continue]
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
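Node 0 reports HugePages_Surp 0, so the accumulation at setup/hugepages.sh@117 leaves its expected count untouched; node 1 is queried the same way next. A compact view of that bookkeeping with this run's numbers, in plain shell and reusing the get_meminfo sketch above:

  # Per-node expected counts for the odd 1025-page split set up by the test (512 + 513 = 1025).
  declare -a nodes_test=([0]=512 [1]=513)
  resv=0                                         # HugePages_Rsvd read earlier in this run
  for node in "${!nodes_test[@]}"; do
      (( nodes_test[node] += resv ))             # hugepages.sh@116
      (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))   # hugepages.sh@117, 0 on both nodes here
  done
  echo $(( nodes_test[0] + nodes_test[1] ))      # 1025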
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17-@24 -- # [local get=HugePages_Surp, node=1, var, val; mem_f=/sys/devices/system/node/node1/meminfo]
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49540468 kB' 'MemUsed: 11147916 kB' 'SwapCached: 0 kB' 'Active: 4729000 kB' 'Inactive: 3407548 kB' 'Active(anon): 4524672 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7849984 kB' 'Mapped: 44072 kB' 'AnonPages: 286608 kB' 'Shmem: 4238108 kB' 'KernelStack: 9208 kB' 'PageTables: 3612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127680 kB' 'Slab: 343920 kB' 'SReclaimable: 127680 kB' 'SUnreclaim: 216240 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:03:53.037 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-@32 -- # [per-field scan of node1 meminfo: fields MemTotal through AnonHugePages compared against HugePages_Surp and skipped via continue]
00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- 
# echo 'node0=512 expecting 513' 00:03:53.038 node0=512 expecting 513 00:03:53.038 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:53.039 node1=513 expecting 512 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:53.039 00:03:53.039 real 0m3.021s 00:03:53.039 user 0m1.223s 00:03:53.039 sys 0m1.803s 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:53.039 18:40:37 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:53.039 ************************************ 00:03:53.039 END TEST odd_alloc 00:03:53.039 ************************************ 00:03:53.039 18:40:37 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:53.039 18:40:37 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:53.039 18:40:37 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:53.039 18:40:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:53.039 ************************************ 00:03:53.039 START TEST custom_alloc 00:03:53.039 ************************************ 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- 
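Just above, odd_alloc wraps up: the odd hugepage count was split across the two NUMA nodes as 512 and 513 pages, the per-node results are echoed ('node0=512 expecting 513', 'node1=513 expecting 512'), and the test passes because the sorted actual counts match the sorted expectations regardless of which node received the extra page ([[ 512 513 == \5\1\2\ \5\1\3 ]]). A rough sketch of that order-insensitive comparison, with hypothetical values standing in for the real nodes_test bookkeeping:

    # Sketch: pass if the per-node counts match the expectations in any node order.
    test_counts=(513 512)   # hypothetical counts the test configured per node
    sys_counts=(512 513)    # hypothetical counts read back from the kernel
    sorted_t=$(printf '%s\n' "${test_counts[@]}" | sort -n | xargs)
    sorted_s=$(printf '%s\n' "${sys_counts[@]}" | sort -n | xargs)
    [[ $sorted_t == "$sorted_s" ]] && echo "odd_alloc OK: $sorted_t"

The trace then moves straight into the next test, custom_alloc, which builds its own per-node hugepage requests.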
setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 
-- # for node in "${!nodes_hp[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.039 18:40:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:56.335 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:56.335 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:56.335 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:56.335 0000:80:04.0 (8086 2021): Already using the vfio-pci 
driver 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76828056 kB' 'MemAvailable: 80245968 kB' 'Buffers: 2696 kB' 'Cached: 9750024 kB' 'SwapCached: 0 kB' 'Active: 6752996 kB' 'Inactive: 3507864 kB' 'Active(anon): 6355672 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511412 kB' 'Mapped: 169888 kB' 'Shmem: 5847532 kB' 'KReclaimable: 217900 kB' 'Slab: 663384 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445484 kB' 'KernelStack: 19584 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7721692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:56.335 18:40:41 
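The /proc/meminfo snapshot just printed shows HugePages_Total: 1536 with Hugepagesize: 2048 kB (Hugetlb: 3145728 kB = 1536 x 2048 kB), which is exactly the sum of the two per-node requests custom_alloc assembled above: get_test_nr_hugepages 1048576 kB became nodes_hp[0]=512 pages and get_test_nr_hugepages 2097152 kB became nodes_hp[1]=1024 pages, handed to setup.sh as HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'. A small standalone sketch of that assembly step (variable names mirror hugepages.sh, but the snippet itself is illustrative):

    # Sketch: turn per-node page requests into a HUGENODE string and a total.
    nodes_hp=(512 1024)        # pages wanted on node0 and node1
    HUGENODE=()
    _nr_hugepages=0
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))
    done
    hugenode_str=$(IFS=,; printf '%s' "${HUGENODE[*]}")
    echo "HUGENODE=$hugenode_str NRHUGE=$_nr_hugepages"   # 512 + 1024 = 1536 pages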
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.335 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
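The scan in progress here is get_meminfo AnonHugePages, triggered because hugepages.sh first checked /sys/kernel/mm/transparent_hugepage/enabled (reported above as 'always [madvise] never', i.e. THP is not pinned to [never]) before deciding whether anonymous huge pages need to be accounted for. A sketch of that guard, using an awk lookup as a simplified stand-in for get_meminfo:

    # Sketch: only bother reading AnonHugePages when THP is not disabled.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    fi
    echo "AnonHugePages=${anon:-0} kB"   # comes back 0 in this run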
00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 
18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.336 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
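get_meminfo was just re-entered for HugePages_Surp with an empty node argument, so the test against /sys/devices/system/node/node/meminfo fails and it falls back to the global /proc/meminfo; when a node number is supplied the per-node file is used instead, and the mem=("${mem[@]#Node +([0-9]) }") step strips the 'Node N ' prefix those per-node files put on every line. A rough standalone sketch of that source selection:

    # Sketch: pick a per-node meminfo file when a node is given, else /proc/meminfo.
    node=${1:-}                                  # first argument: node number, may be empty
    mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    shopt -s extglob
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")             # no-op for /proc/meminfo lines
    printf '%s\n' "${mem[@]:0:3}"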
00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76832072 kB' 'MemAvailable: 80249984 kB' 'Buffers: 2696 kB' 'Cached: 9750028 kB' 'SwapCached: 0 kB' 'Active: 6752856 kB' 'Inactive: 3507864 kB' 'Active(anon): 6355532 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511332 kB' 'Mapped: 169872 kB' 'Shmem: 5847536 kB' 'KReclaimable: 217900 kB' 'Slab: 663396 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445496 kB' 'KernelStack: 19616 kB' 'PageTables: 7972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7721708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221288 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
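The pass below looks up HugePages_Surp (it comes back 0 further down), and the trace then repeats the pattern once more for HugePages_Rsvd; together with the totals from the snapshot these feed the final check that the pages custom_alloc requested are actually present. A hedged sketch of that accounting - the exact expression hugepages.sh uses may differ, this is only the general shape:

    # Sketch: hypothetical final check that requested hugepages really exist,
    # net of surplus pages.
    nr_hugepages=1536                                  # what the test asked setup.sh for
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    if (( total - surp == nr_hugepages )); then
        echo "hugepages OK: total=$total surp=$surp resv=$resv"
    else
        echo "hugepages mismatch: expected $nr_hugepages, have $((total - surp))" >&2
    fi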
00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.337 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.338 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76832072 kB' 'MemAvailable: 80249984 kB' 'Buffers: 2696 kB' 'Cached: 9750048 kB' 'SwapCached: 0 kB' 'Active: 6752900 kB' 'Inactive: 3507864 kB' 'Active(anon): 6355576 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511332 kB' 'Mapped: 169872 kB' 'Shmem: 5847556 kB' 'KReclaimable: 217900 kB' 'Slab: 663396 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445496 kB' 'KernelStack: 19616 kB' 'PageTables: 7972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7721728 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 221304 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
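The xtrace above is setup/common.sh's get_meminfo helper walking /proc/meminfo one "key: value" pair at a time until it reaches the requested counter (first HugePages_Surp, now HugePages_Rsvd); every repeated "[[ ... == pattern ]] / continue / IFS / read" group is one loop iteration with tracing enabled, and only the final "echo" is the value the caller consumes. A minimal standalone sketch of that lookup, reconstructed from the trace rather than copied from the SPDK source (the function name, quoting and return behaviour here are assumptions):

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo line var val _

    # Per-node counters live under sysfs; fall back to /proc/meminfo when no
    # node is given (the "[[ -e .../node$node/meminfo ]]" test in the trace).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    while read -r line; do
        line=${line#"Node $node "}        # per-node files prefix every line
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"                   # bare number; the "kB" unit lands in "_"
            return 0
        fi
    done < "$mem_f"
    return 1
}

get_meminfo_sketch HugePages_Rsvd       # 0 in this run
get_meminfo_sketch HugePages_Surp 0     # surplus pages on NUMA node 0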
00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.339 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.340 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@100 -- # resv=0 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:56.642 nr_hugepages=1536 00:03:56.642 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:56.642 resv_hugepages=0 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:56.643 surplus_hugepages=0 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:56.643 anon_hugepages=0 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76833944 kB' 'MemAvailable: 80251856 kB' 'Buffers: 2696 kB' 'Cached: 9750068 kB' 'SwapCached: 0 kB' 'Active: 6752908 kB' 'Inactive: 3507864 kB' 'Active(anon): 6355584 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511296 kB' 'Mapped: 169872 kB' 'Shmem: 5847576 kB' 'KReclaimable: 217900 kB' 'Slab: 663396 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445496 kB' 'KernelStack: 19600 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7721752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221304 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
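At this point the test has established surp=0 and resv=0, echoed nr_hugepages=1536, and hugepages.sh@107-@110 re-reads HugePages_Total to confirm the kernel honored the request exactly, i.e. 1536 == 1536 + 0 + 0. A hedged standalone equivalent of that check (variable names follow the trace; the awk extraction is an assumption, the script itself goes through get_meminfo):

nr_hugepages=1536                                             # requested pool size
surp=$(awk '/^HugePages_Surp:/  {print $2}' /proc/meminfo)    # 0 in this run
resv=$(awk '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)    # 0 in this run
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1536 in this run

# Nothing may be reserved or handed out as surplus beyond the request.
(( total == nr_hugepages + surp + resv )) || echo 'unexpected hugepage pool' >&2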
00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
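Further below, once HugePages_Total comes back as 1536, the trace reaches get_nodes and the nodes_sys assignments: the custom_alloc case deliberately splits the pool across the two NUMA nodes of this machine as 512 pages on node0 and 1024 on node1, then reads each node's share back from /sys/devices/system/node/nodeN/meminfo. A small sketch of reading that split directly from sysfs, with the expected counts taken from this log purely as an illustration:

declare -A expected=( [0]=512 [1]=1024 )    # per-node targets from this run

for node in /sys/devices/system/node/node[0-9]*; do
    id=${node##*node}
    got=$(awk '/HugePages_Total:/ {print $NF}' "$node/meminfo")
    echo "node$id: HugePages_Total=$got (expected ${expected[$id]:-?})"
done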
00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.643 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # 
get_meminfo HugePages_Surp 0 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:56.644 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28351656 kB' 'MemUsed: 4282972 kB' 'SwapCached: 0 kB' 'Active: 2026728 kB' 'Inactive: 100316 kB' 'Active(anon): 1833732 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902800 kB' 'Mapped: 125800 kB' 'AnonPages: 227536 kB' 'Shmem: 1609488 kB' 'KernelStack: 10632 kB' 'PageTables: 4656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90220 kB' 'Slab: 319496 kB' 'SReclaimable: 90220 kB' 'SUnreclaim: 229276 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.645 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.646 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 48482264 kB' 'MemUsed: 12206120 kB' 'SwapCached: 0 kB' 'Active: 4726220 kB' 'Inactive: 3407548 kB' 'Active(anon): 4521892 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7849984 kB' 'Mapped: 44072 kB' 'AnonPages: 283784 kB' 'Shmem: 4238108 kB' 'KernelStack: 8968 kB' 'PageTables: 3316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127680 kB' 'Slab: 343900 kB' 'SReclaimable: 127680 kB' 'SUnreclaim: 216220 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
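The pattern traced here (and repeated for every field above) is a get_meminfo-style lookup: pick /proc/meminfo or the per-node copy under /sys/devices/system/node, scan field names until the requested key (HugePages_Surp) is found, and print its value. A minimal stand-alone sketch of that idea, inferred from the trace; the function name and structure below are illustrative, not the SPDK setup/common.sh source:

#!/usr/bin/env bash
# Illustrative sketch only -- behaviour inferred from the trace, not copied from SPDK.
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    # Use the per-NUMA-node view when a node id is given and the sysfs file exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while read -r line; do
        line=${line#Node +([0-9]) }          # per-node lines carry a "Node <id> " prefix
        IFS=': ' read -r var val _ <<< "$line"
        # Skip every field until the requested one, then print its value.
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done < "$mem_f"
    echo 0                                   # field not present
}

get_meminfo_sketch HugePages_Surp 1          # prints 0 for the node traced above
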
00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.646 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:56.647 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:56.648 node0=512 expecting 512 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:56.648 node1=1024 expecting 1024 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:56.648 00:03:56.648 real 0m3.505s 00:03:56.648 user 0m1.430s 00:03:56.648 sys 0m2.139s 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:56.648 18:40:41 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:56.648 ************************************ 00:03:56.648 END TEST custom_alloc 00:03:56.648 ************************************ 00:03:56.648 18:40:41 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:56.648 18:40:41 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:56.648 18:40:41 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:56.648 18:40:41 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:56.648 ************************************ 00:03:56.648 START TEST no_shrink_alloc 00:03:56.648 ************************************ 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 
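Before "setup output" runs, the no_shrink_alloc prologue above converts the requested size into a hugepage count and pins it to the listed node(s): 2097152 kB with a 2048 kB hugepage size gives nr_hugepages=1024, all assigned to node 0. A rough sketch of that bookkeeping, under the assumption that the size is in kB and every listed node receives the full count; the names below are illustrative, not the setup/hugepages.sh implementation:

#!/usr/bin/env bash
# Illustrative sketch only -- the arithmetic the trace above implies.
default_hugepages=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # in kB, typically 2048

get_test_nr_hugepages_sketch() {
    local size_kb=$1; shift
    local node_ids=("$@")                      # explicit NUMA nodes, e.g. "0"
    local nr_hugepages=$(( size_kb / default_hugepages ))
    declare -ga nodes_test=()
    if (( ${#node_ids[@]} > 0 )); then
        local node
        for node in "${node_ids[@]}"; do
            nodes_test[node]=$nr_hugepages     # every requested node gets the full count
        done
    else
        nodes_test[0]=$nr_hugepages            # no nodes given: default to node 0
    fi
}

get_test_nr_hugepages_sketch 2097152 0
echo "node0=${nodes_test[0]}"                  # node0=1024 with 2048 kB hugepages
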
00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.648 18:40:41 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:59.192 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:59.453 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:59.453 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:59.453 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:59.453 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:59.453 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:59.454 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77866116 kB' 'MemAvailable: 81284028 kB' 'Buffers: 2696 kB' 'Cached: 9750180 kB' 'SwapCached: 0 kB' 'Active: 6754628 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357304 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512464 kB' 'Mapped: 170004 kB' 'Shmem: 5847688 kB' 'KReclaimable: 217900 kB' 'Slab: 662952 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 445052 kB' 'KernelStack: 19664 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7723672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.454 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.455 18:40:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:59.455 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:59.456 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77867080 kB' 'MemAvailable: 81284992 kB' 'Buffers: 2696 kB' 'Cached: 9750184 kB' 'SwapCached: 0 kB' 'Active: 6753952 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356628 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512296 kB' 'Mapped: 169884 kB' 'Shmem: 5847692 kB' 'KReclaimable: 217900 kB' 'Slab: 662868 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444968 kB' 'KernelStack: 19584 kB' 'PageTables: 7908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7723528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221304 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB'
[setup/common.sh@31-32: the loop repeats IFS=': '; read -r var val _; continue for each field of the snapshot above until it reaches HugePages_Surp]
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
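The surplus lookup above is the same get_meminfo helper from setup/common.sh that the trace keeps re-entering. A minimal sketch of what that helper appears to do, reconstructed only from the commands visible in this trace (the function body below is an assumption, not the verbatim SPDK source):

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) pattern seen in the trace

  # get_meminfo FIELD [NODE] -- print the value of FIELD from /proc/meminfo,
  # or from the per-node meminfo file when a NUMA node number is given.
  get_meminfo() {
      local get=$1
      local node=${2:-}
      local var val _
      local mem_f mem line

      mem_f=/proc/meminfo
      if [[ -e /sys/devices/system/node/node$node/meminfo ]] && [[ -n $node ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi

      mapfile -t mem < "$mem_f"
      # Per-node files prefix every line with "Node <n> "; strip that prefix.
      mem=("${mem[@]#Node +([0-9]) }")

      # Scan field by field, exactly as the trace does, until the requested
      # field matches, then print its value.
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue
          echo "$val"
          return 0
      done
      return 1
  }

  get_meminfo HugePages_Surp   # prints 0 on the machine captured above

The Rsvd and Total lookups that follow in the log go through the same scan, which is why the full meminfo snapshot is printed again before each one.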
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:59.721 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77865044 kB' 'MemAvailable: 81282956 kB' 'Buffers: 2696 kB' 'Cached: 9750200 kB' 'SwapCached: 0 kB' 'Active: 6754076 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356752 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512348 kB' 'Mapped: 169884 kB' 'Shmem: 5847708 kB' 'KReclaimable: 217900 kB' 'Slab: 662868 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444968 kB' 'KernelStack: 19776 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7725040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221416 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB'
[setup/common.sh@31-32: the same per-field scan repeats over the snapshot above until it reaches HugePages_Rsvd]
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
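Both lookups return 0: the pool has no surplus and no reserved pages at this point. The same counters the helper walks past can be read directly from /proc/meminfo, for example:

  $ grep -E '^(HugePages_(Total|Free|Rsvd|Surp)|Hugepagesize):' /proc/meminfo
  HugePages_Total:    1024
  HugePages_Free:     1024
  HugePages_Rsvd:        0
  HugePages_Surp:        0
  Hugepagesize:       2048 kB

The values shown are the ones visible in the meminfo snapshots above.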
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:59.723 nr_hugepages=1024
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:59.723 resv_hugepages=0
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:59.723 surplus_hugepages=0
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:59.723 anon_hugepages=0
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.723 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
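The 1024 pages being checked here are the default-size (2048 kB) hugepage pool for this run, presumably configured earlier in the job. Outside of this harness, a pool of that size is typically requested through the stock kernel knobs; a minimal example using standard kernel interfaces, not SPDK commands:

  # request 1024 default-size hugepages system-wide (requires root)
  sysctl vm.nr_hugepages=1024
  # per-size sysfs equivalent, and the read-back the checks above rely on
  echo 1024 > /sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
  cat /sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages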
# [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.724 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
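[Editorial sketch, not part of the captured log] The long run of "continue" entries above is setup/common.sh scanning a meminfo file one field at a time until it reaches the requested key (here HugePages_Total, which it then echoes). A minimal bash reconstruction of that scan, inferred from the traced lines (mapfile, IFS=': ', read -r var val _, continue, echo, return 0); the function name, argument order and overall shape are assumptions, only the file paths and field names come from the log itself:

#!/usr/bin/env bash
shopt -s extglob

get_meminfo() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    # A per-node query reads that node's own meminfo file instead, as the trace
    # does for /sys/devices/system/node/node0/meminfo.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # node files prefix every line with "Node N "
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the trace's long run of "continue" lines
        echo "$val"                        # e.g. 1024 for HugePages_Total
        return 0
    done
    return 1
}

# Example: get_meminfo HugePages_Total      -> system-wide hugepage count
#          get_meminfo HugePages_Surp 0     -> surplus hugepages on NUMA node 0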
00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # 
nodes_sys[${node##*node}]=0 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27305248 kB' 'MemUsed: 5329380 kB' 'SwapCached: 0 kB' 'Active: 2027028 kB' 'Inactive: 100316 kB' 'Active(anon): 1834032 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1902956 kB' 'Mapped: 125812 kB' 'AnonPages: 227656 kB' 'Shmem: 1609644 kB' 'KernelStack: 10600 kB' 'PageTables: 4568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90220 kB' 'Slab: 318824 kB' 'SReclaimable: 90220 kB' 'SUnreclaim: 228604 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.725 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:59.726 node0=1024 expecting 1024 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:59.726 18:40:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:02.261 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:02.520 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:02.520 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:02.520 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:02.520 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:02.784 
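[Editorial sketch, not part of the captured log] The hugepages.sh trace just above walks a per-node check: get_nodes counts the hugepages assigned to each NUMA node, per-node surplus is folded in via get_meminfo, and the result is reported as "node0=1024 expecting 1024". A small bash approximation of that loop; helper and variable names mirror the trace, while the exact script body and the assumption of 2048 kB hugepages are inferences, not the original source:

verify_node_hugepages() {
    local node
    local -a nodes_test
    # Count hugepages currently assigned to each NUMA node (get_nodes in the trace).
    for node in /sys/devices/system/node/node[0-9]*; do
        nodes_test[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    # Fold in per-node surplus pages (hugepages.sh@117 via get_meminfo, see the
    # sketch earlier) and report; the traced script then compares this against
    # the requested allocation before printing "node0=1024 expecting 1024".
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]}"
    done
}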
18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77820876 kB' 'MemAvailable: 81238788 kB' 'Buffers: 2696 kB' 'Cached: 9750312 kB' 'SwapCached: 0 kB' 'Active: 6754880 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357556 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512480 kB' 'Mapped: 170012 kB' 'Shmem: 5847820 kB' 'KReclaimable: 217900 kB' 'Slab: 662860 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444960 kB' 'KernelStack: 19632 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7723108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.784 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.785 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77821692 kB' 'MemAvailable: 81239604 kB' 'Buffers: 2696 kB' 'Cached: 9750316 kB' 'SwapCached: 0 kB' 'Active: 6754464 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357140 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512088 kB' 'Mapped: 169968 kB' 'Shmem: 5847824 kB' 'KReclaimable: 217900 kB' 'Slab: 662860 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444960 kB' 'KernelStack: 19584 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7723128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.786 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
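[Editorial sketch, not part of the captured log] The verify_nr_hugepages pass traced in this stretch begins (hugepages.sh@96-97) by probing anonymous hugepages: it only reads AnonHugePages when transparent hugepages are not fully disabled, which is the "[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]" test in the log. A hedged bash approximation; the sysfs path is standard Linux, the surrounding function shape is an assumption:

get_anon_hugepages() {
    local anon=0 thp
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    # Matches the trace's check that the THP setting does not contain "[never]".
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # reuses the get_meminfo sketch above
    fi
    echo "$anon"
}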
00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.787 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77822004 kB' 'MemAvailable: 81239916 kB' 'Buffers: 2696 kB' 'Cached: 9750332 kB' 'SwapCached: 0 kB' 'Active: 6754044 kB' 'Inactive: 3507864 kB' 'Active(anon): 6356720 kB' 
'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512152 kB' 'Mapped: 169892 kB' 'Shmem: 5847840 kB' 'KReclaimable: 217900 kB' 'Slab: 662856 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444956 kB' 'KernelStack: 19616 kB' 'PageTables: 7968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7723148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221320 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.788 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.788 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.789 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:02.790 nr_hugepages=1024 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:02.790 resv_hugepages=0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:02.790 surplus_hugepages=0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:02.790 anon_hugepages=0 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
93323012 kB' 'MemFree: 77822516 kB' 'MemAvailable: 81240428 kB' 'Buffers: 2696 kB' 'Cached: 9750356 kB' 'SwapCached: 0 kB' 'Active: 6754660 kB' 'Inactive: 3507864 kB' 'Active(anon): 6357336 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512808 kB' 'Mapped: 169832 kB' 'Shmem: 5847864 kB' 'KReclaimable: 217900 kB' 'Slab: 662856 kB' 'SReclaimable: 217900 kB' 'SUnreclaim: 444956 kB' 'KernelStack: 19648 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7722804 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221240 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2053076 kB' 'DirectMap2M: 13355008 kB' 'DirectMap1G: 87031808 kB' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.790 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.791 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
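[Editor's note] The xtrace above repeats one lookup pattern many times: get_meminfo in setup/common.sh loads /proc/meminfo (or /sys/devices/system/node/node<N>/meminfo when a node is given, with the "Node N " prefix stripped), then scans it with IFS=': ' and read -r var val _ until the requested field name matches, echoes that field's value, and returns. The following is a minimal standalone sketch of that lookup, written only from what the trace shows; the function name and argument handling here are illustrative, not the verbatim SPDK helper.

    # illustrative sketch of the lookup performed repeatedly in the trace above
    get_meminfo_sketch() {
        local get=$1 node=${2:-}          # field name, optional NUMA node number
        local var val _
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # per-node meminfo lines start with "Node N "; strip it so the field
        # name is the first token, then split on ':' and whitespace as the
        # traced helper does with IFS=': '
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"               # e.g. 0 for HugePages_Surp, 1024 for HugePages_Total
                return 0
            fi
        done < <(sed 's/^Node [0-9][0-9]* //' "$mem_f")
        return 1
    }
    # usage matching the values echoed in this log:
    #   get_meminfo_sketch HugePages_Surp     -> 0
    #   get_meminfo_sketch HugePages_Total    -> 1024

The no_shrink_alloc test then uses these values to confirm that the configured 1024 hugepages are all accounted for (nr_hugepages + surplus + reserved) before checking how they are distributed across the NUMA nodes.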
00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27267972 kB' 'MemUsed: 5366656 kB' 'SwapCached: 0 kB' 'Active: 2026016 kB' 'Inactive: 100316 kB' 'Active(anon): 1833020 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 
kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1903096 kB' 'Mapped: 125820 kB' 'AnonPages: 226460 kB' 'Shmem: 1609784 kB' 'KernelStack: 10584 kB' 'PageTables: 4512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 90220 kB' 'Slab: 318656 kB' 'SReclaimable: 90220 kB' 'SUnreclaim: 228436 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 
18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:02.794 node0=1024 expecting 1024 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:02.794 00:04:02.794 real 0m6.244s 00:04:02.794 user 0m2.449s 00:04:02.794 sys 0m3.845s 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:02.794 18:40:47 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:02.794 ************************************ 00:04:02.794 END TEST no_shrink_alloc 00:04:02.794 ************************************ 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:02.794 18:40:47 
setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:02.794 18:40:47 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:02.794 00:04:02.794 real 0m23.300s 00:04:02.794 user 0m8.963s 00:04:02.794 sys 0m13.644s 00:04:02.794 18:40:47 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:02.794 18:40:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:02.794 ************************************ 00:04:02.794 END TEST hugepages 00:04:02.794 ************************************ 00:04:03.053 18:40:47 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:03.053 18:40:47 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:03.053 18:40:47 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:03.053 18:40:47 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:03.053 ************************************ 00:04:03.053 START TEST driver 00:04:03.053 ************************************ 00:04:03.053 18:40:47 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:03.053 * Looking for test storage... 
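The hugepages assertions above are produced by a get_meminfo helper that scans /proc/meminfo, or the per-node /sys/devices/system/node/nodeN/meminfo with its "Node N " prefix stripped, using IFS=': ' and printing the value of the requested key; clear_hp then writes 0 back into every per-node hugepage pool. A minimal standalone sketch of both, assuming the usual nr_hugepages sysfs attribute (the attribute name itself is not spelled out in the trace):

#!/usr/bin/env bash
# Minimal sketch of the get_meminfo pattern traced above.
get_meminfo() {
    local key=$1 node=$2 mem_f=/proc/meminfo line var val _
    # Per-node statistics live in sysfs and carry a "Node N " prefix.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS= read -r line; do
        line=${line#"Node $node "}             # strip the per-node prefix, if any
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}

# Sketch of the clear_hp idea: zero every per-node hugepage pool (needs root;
# nr_hugepages is an assumed attribute name, standard for this sysfs layout).
clear_hp() {
    local hp
    for hp in /sys/devices/system/node/node*/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"
    done
}

get_meminfo HugePages_Total      # 1024 in this run, matching the assertion above
get_meminfo HugePages_Surp 0     # surplus pages on node0, 0 here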
00:04:03.053 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:03.053 18:40:47 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:03.053 18:40:47 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:03.053 18:40:47 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:08.326 18:40:52 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:08.326 18:40:52 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:08.326 18:40:52 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.326 18:40:52 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:08.326 ************************************ 00:04:08.326 START TEST guess_driver 00:04:08.326 ************************************ 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 220 > 0 )) 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:08.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:08.326 Looking for driver=vfio-pci 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.326 18:40:52 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:10.233 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ denied == \-\> ]] 00:04:10.233 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:10.233 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.492 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:10.752 18:40:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:11.689 18:40:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:11.689 18:40:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:11.690 18:40:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:11.690 18:40:56 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:11.690 18:40:56 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:11.690 18:40:56 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:11.690 18:40:56 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:15.895 00:04:15.895 real 0m8.560s 00:04:15.895 user 0m2.618s 00:04:15.895 sys 0m4.411s 00:04:15.895 18:41:00 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:15.895 18:41:00 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:15.895 
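The guess_driver trace above settles on vfio-pci once the host shows populated IOMMU groups (220 of them here) and modprobe can resolve vfio_pci to real .ko files; a rough standalone sketch of that decision, where treating unsafe no-IOMMU mode as an alternative green light and the uio_pci_generic fallback are assumptions drawn from the 'No valid driver found' branch rather than shown succeeding in this run:

#!/usr/bin/env bash
shopt -s nullglob   # so an empty /sys/kernel/iommu_groups glob counts as zero
# Rough sketch of the driver-pick logic traced above.

is_driver() {
    # A module counts as usable if modprobe resolves it to .ko(.xz) files.
    [[ $(modprobe --show-depends "$1" 2>/dev/null) == *.ko* ]]
}

pick_driver() {
    local unsafe_vfio=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    local iommu_groups=(/sys/kernel/iommu_groups/*)

    if { (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == Y ]]; } && is_driver vfio_pci; then
        echo vfio-pci
    elif is_driver uio_pci_generic; then     # assumed fallback, not exercised here
        echo uio_pci_generic
    else
        echo 'No valid driver found'
    fi
}

driver=$(pick_driver)
echo "Looking for driver=$driver"            # vfio-pci on this node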
************************************ 00:04:15.895 END TEST guess_driver 00:04:15.895 ************************************ 00:04:16.153 00:04:16.154 real 0m13.092s 00:04:16.154 user 0m4.009s 00:04:16.154 sys 0m6.805s 00:04:16.154 18:41:00 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:16.154 18:41:00 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:16.154 ************************************ 00:04:16.154 END TEST driver 00:04:16.154 ************************************ 00:04:16.154 18:41:00 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:16.154 18:41:00 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:16.154 18:41:00 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:16.154 18:41:00 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:16.154 ************************************ 00:04:16.154 START TEST devices 00:04:16.154 ************************************ 00:04:16.154 18:41:00 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:16.154 * Looking for test storage... 00:04:16.154 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:16.154 18:41:01 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:16.154 18:41:01 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:16.154 18:41:01 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:16.154 18:41:01 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1667 -- # zoned_devs=() 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1667 -- # local -gA zoned_devs 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1668 -- # local nvme bdf 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1660 -- # local device=nvme0n1 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n1 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1660 -- # local device=nvme1n1 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1662 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ none != none ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1670 -- # for nvme in /sys/block/nvme* 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n2 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1660 -- # local device=nvme1n2 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1662 -- # [[ -e 
/sys/block/nvme1n2/queue/zoned ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ host-managed != none ]] 00:04:20.343 18:41:04 setup.sh.devices -- common/autotest_common.sh@1672 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:20.343 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:20.344 18:41:04 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:20.344 18:41:04 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:20.344 No valid GPT data, bailing 00:04:20.344 18:41:04 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:20.344 18:41:04 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:20.344 18:41:04 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:20.344 18:41:04 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:20.344 18:41:04 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:20.344 18:41:04 setup.sh.devices -- setup/common.sh@80 -- # echo 1000204886016 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@204 -- # (( 1000204886016 >= min_disk_size )) 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n1 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n2 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:20.344 
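get_zoned_devs above walks /sys/block/nvme* and records any namespace whose queue/zoned attribute is not 'none' (nvme1n2 reports host-managed here and is then skipped during device enumeration); a minimal sketch of that filter, in which resolving the owning PCI address through the device symlink is an assumption, since the trace only shows the resulting 0000:5f:00.0:

#!/usr/bin/env bash
# Sketch of the zoned-namespace filter traced above.
declare -A zoned_devs=()

is_block_zoned() {
    local device=$1
    [[ -e /sys/block/$device/queue/zoned ]] || return 1
    [[ $(< "/sys/block/$device/queue/zoned") != none ]]
}

for nvme in /sys/block/nvme*; do
    dev=${nvme##*/}
    if is_block_zoned "$dev"; then
        # Assumed way to find the owning PCI address for a PCIe NVMe namespace.
        pci=$(readlink -f "$nvme/device/device" 2>/dev/null)
        zoned_devs["$dev"]=${pci##*/}
    fi
done

echo "zoned namespaces to skip: ${!zoned_devs[*]}"   # nvme1n2 in this run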
18:41:04 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:20.344 18:41:04 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:20.344 18:41:04 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:20.344 18:41:04 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:20.344 18:41:04 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:20.344 ************************************ 00:04:20.344 START TEST nvme_mount 00:04:20.344 ************************************ 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:20.344 18:41:04 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:20.912 Creating new GPT entries in memory. 00:04:20.912 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:20.912 other utilities. 00:04:20.912 18:41:05 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:20.912 18:41:05 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:20.912 18:41:05 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:04:20.912 18:41:05 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:20.912 18:41:05 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:21.849 Creating new GPT entries in memory. 00:04:21.849 The operation has completed successfully. 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1984153 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.849 18:41:06 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:24.419 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:24.420 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:24.679 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:24.679 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:24.938 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:24.938 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:24.938 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:24.938 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.938 18:41:09 
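The nvme_mount sequence above is a partition, format, and mount cycle followed by a cleanup that unmounts and wipes signatures; a condensed sketch using the same commands that appear in the trace (the disk and mount point are placeholders, and udevadm settle is an assumed stand-in for the sync_dev_uevents.sh helper the test uses to wait for the new partition node):

#!/usr/bin/env bash
# Condensed sketch of the partition/format/mount/cleanup cycle traced above.
disk=/dev/nvme0n1        # placeholder: a scratch disk you can safely wipe
mnt=/tmp/nvme_mount      # placeholder mount point

# Partition: wipe the GPT, then create one ~1 GiB partition (sectors 2048-2099199).
sgdisk "$disk" --zap-all
sgdisk "$disk" --new=1:2048:2099199
udevadm settle           # assumption: wait for ${disk}p1 to appear

# Format and mount the new partition.
mkdir -p "$mnt"
mkfs.ext4 -qF "${disk}p1"
mount "${disk}p1" "$mnt"

# Cleanup, as cleanup_nvme does: unmount, then erase filesystem and GPT signatures.
if mountpoint -q "$mnt"; then umount "$mnt"; fi
wipefs --all "${disk}p1"
wipefs --all "$disk"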
setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.938 18:41:09 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 
]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.227 18:41:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- 
setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.227 18:41:13 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:30.761 18:41:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.761 18:41:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:31.329 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:31.329 00:04:31.329 real 0m11.578s 00:04:31.329 user 0m3.349s 00:04:31.329 sys 0m5.806s 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.329 18:41:16 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:31.329 ************************************ 00:04:31.329 END TEST nvme_mount 00:04:31.329 ************************************ 00:04:31.330 18:41:16 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 
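Note on the test wrapper: the END TEST nvme_mount banner above closes a test that mounted the NVMe namespace, verified it against the PCI allow list, then unmounted and wiped the filesystem signature; the dm_mount test launched by the run_test call above is wrapped the same way. Judging only from the START/END banners and the real/user/sys timing lines in this log, run_test times a named test function and brackets it with banners. A minimal sketch of that wrapping pattern (an illustration inferred from the log output, not the actual autotest_common.sh helper):

    # wrap a test function with banners and timing, roughly as the
    # START/END TEST markers and time output in this log suggest
    run_test_sketch() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"                  # e.g. run_test_sketch dm_mount dm_mount
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }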
00:04:31.330 18:41:16 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.330 18:41:16 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.330 18:41:16 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:31.330 ************************************ 00:04:31.330 START TEST dm_mount 00:04:31.330 ************************************ 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:31.330 18:41:16 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:32.709 Creating new GPT entries in memory. 00:04:32.709 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:32.709 other utilities. 00:04:32.709 18:41:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:32.709 18:41:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:32.709 18:41:17 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:32.709 18:41:17 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:32.709 18:41:17 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:33.658 Creating new GPT entries in memory. 00:04:33.658 The operation has completed successfully. 
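The partition_drive trace above shows the first 1 GiB partition being carved out of nvme0n1: size=1073741824 bytes is converted to 2097152 512-byte sectors, the old label is destroyed with sgdisk --zap-all, and each partition is created with sgdisk --new while holding flock on the disk so nothing races the table update (the script also waits for the matching udev events via sync_dev_uevents.sh). The second partition is created the same way just below. A condensed sketch of that sequence, using the device name and sector numbers taken from this log:

    disk=/dev/nvme0n1
    size=$(( 1073741824 / 512 ))       # 1 GiB expressed in 512-byte sectors = 2097152
    sgdisk "$disk" --zap-all           # wipe any existing GPT/MBR structures
    # partition 1 spans sectors 2048..2099199; partition 2 starts right after it
    flock "$disk" sgdisk "$disk" --new=1:2048:$(( 2048 + size - 1 ))
    flock "$disk" sgdisk "$disk" --new=2:2099200:$(( 2099200 + size - 1 ))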
00:04:33.658 18:41:18 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:33.658 18:41:18 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:33.658 18:41:18 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:33.658 18:41:18 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:33.658 18:41:18 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:34.597 The operation has completed successfully. 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1988901 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:34.597 
18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:34.597 18:41:19 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.886 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 
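The dm_mount flow traced above builds the stack in three steps: dmsetup create nvme_dm_test assembles a device-mapper target over the two new partitions, readlink -f /dev/mapper/nvme_dm_test resolves the friendly name to the kernel node (/dev/dm-0 here), and the holders links under /sys/class/block/nvme0n1p1/holders and .../nvme0n1p2/holders confirm that both partitions back the dm device before mkfs.ext4 -qF and the mount. The verify call above then matches those holder@...:dm-0 entries against the active-device list. A small sketch of the resolve-and-check step, assuming the same device names that appear in this log:

    dm_name=nvme_dm_test
    dm_node=$(basename "$(readlink -f /dev/mapper/$dm_name)")   # e.g. dm-0
    for part in nvme0n1p1 nvme0n1p2; do
        # each backing partition should list the dm node as a holder
        [[ -e /sys/class/block/$part/holders/$dm_node ]] \
            && echo "$part is held by $dm_node"
    done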
00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:37.887 18:41:22 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:40.421 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.421 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.680 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:40.681 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:40.940 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:40.940 00:04:40.940 real 0m9.533s 00:04:40.940 user 0m2.383s 00:04:40.940 sys 0m4.117s 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:40.940 18:41:25 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:40.940 ************************************ 00:04:40.940 END TEST dm_mount 00:04:40.940 ************************************ 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:40.940 
18:41:25 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:40.940 18:41:25 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:41.200 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:41.200 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:41.200 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:41.200 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:41.200 18:41:26 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:41.200 00:04:41.200 real 0m25.135s 00:04:41.200 user 0m7.217s 00:04:41.200 sys 0m12.333s 00:04:41.200 18:41:26 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.200 18:41:26 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:41.200 ************************************ 00:04:41.200 END TEST devices 00:04:41.200 ************************************ 00:04:41.200 00:04:41.200 real 1m23.882s 00:04:41.200 user 0m27.838s 00:04:41.200 sys 0m45.960s 00:04:41.200 18:41:26 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.200 18:41:26 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:41.200 ************************************ 00:04:41.200 END TEST setup.sh 00:04:41.200 ************************************ 00:04:41.200 18:41:26 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:44.487 Hugepages 00:04:44.487 node hugesize free / total 00:04:44.487 node0 1048576kB 0 / 0 00:04:44.487 node0 2048kB 1024 / 1024 00:04:44.487 node1 1048576kB 0 / 0 00:04:44.487 node1 2048kB 1024 / 1024 00:04:44.487 00:04:44.487 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:44.487 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:44.488 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:44.488 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:04:44.488 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2 00:04:44.488 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:44.488 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:44.488 I/OAT 
0000:80:04.2 8086 2021 1 ioatdma - - 00:04:44.488 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:44.488 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:44.488 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:44.747 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:44.747 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:44.747 18:41:29 -- spdk/autotest.sh@130 -- # uname -s 00:04:44.747 18:41:29 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:44.747 18:41:29 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:44.747 18:41:29 -- common/autotest_common.sh@1529 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:47.282 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:47.861 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:47.861 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:48.830 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:48.830 18:41:33 -- common/autotest_common.sh@1530 -- # sleep 1 00:04:49.768 18:41:34 -- common/autotest_common.sh@1531 -- # bdfs=() 00:04:49.768 18:41:34 -- common/autotest_common.sh@1531 -- # local bdfs 00:04:49.768 18:41:34 -- common/autotest_common.sh@1532 -- # bdfs=($(get_nvme_bdfs)) 00:04:49.768 18:41:34 -- common/autotest_common.sh@1532 -- # get_nvme_bdfs 00:04:49.768 18:41:34 -- common/autotest_common.sh@1511 -- # bdfs=() 00:04:49.768 18:41:34 -- common/autotest_common.sh@1511 -- # local bdfs 00:04:49.768 18:41:34 -- common/autotest_common.sh@1512 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:49.768 18:41:34 -- common/autotest_common.sh@1512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:49.768 18:41:34 -- common/autotest_common.sh@1512 -- # jq -r '.config[].params.traddr' 00:04:50.027 18:41:34 -- common/autotest_common.sh@1513 -- # (( 1 == 0 )) 00:04:50.027 18:41:34 -- common/autotest_common.sh@1517 -- # printf '%s\n' 0000:5e:00.0 00:04:50.027 18:41:34 -- common/autotest_common.sh@1534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:52.555 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:52.555 Waiting for block devices as requested 00:04:52.555 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:04:52.555 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:52.813 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:52.813 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:52.813 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:52.813 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:53.071 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:53.071 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:53.071 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 
00:04:53.071 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:53.330 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:53.330 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:53.330 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:53.588 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:53.588 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:53.588 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:53.847 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:53.847 18:41:38 -- common/autotest_common.sh@1536 -- # for bdf in "${bdfs[@]}" 00:04:53.847 18:41:38 -- common/autotest_common.sh@1537 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1500 -- # grep 0000:5e:00.0/nvme/nvme 00:04:53.848 18:41:38 -- common/autotest_common.sh@1500 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 00:04:53.848 18:41:38 -- common/autotest_common.sh@1500 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1501 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:04:53.848 18:41:38 -- common/autotest_common.sh@1505 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1505 -- # printf '%s\n' nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1537 -- # nvme_ctrlr=/dev/nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1538 -- # [[ -z /dev/nvme0 ]] 00:04:53.848 18:41:38 -- common/autotest_common.sh@1543 -- # nvme id-ctrl /dev/nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1543 -- # grep oacs 00:04:53.848 18:41:38 -- common/autotest_common.sh@1543 -- # cut -d: -f2 00:04:53.848 18:41:38 -- common/autotest_common.sh@1543 -- # oacs=' 0xf' 00:04:53.848 18:41:38 -- common/autotest_common.sh@1544 -- # oacs_ns_manage=8 00:04:53.848 18:41:38 -- common/autotest_common.sh@1546 -- # [[ 8 -ne 0 ]] 00:04:53.848 18:41:38 -- common/autotest_common.sh@1552 -- # nvme id-ctrl /dev/nvme0 00:04:53.848 18:41:38 -- common/autotest_common.sh@1552 -- # cut -d: -f2 00:04:53.848 18:41:38 -- common/autotest_common.sh@1552 -- # grep unvmcap 00:04:53.848 18:41:38 -- common/autotest_common.sh@1552 -- # unvmcap=' 0' 00:04:53.848 18:41:38 -- common/autotest_common.sh@1553 -- # [[ 0 -eq 0 ]] 00:04:53.848 18:41:38 -- common/autotest_common.sh@1555 -- # continue 00:04:53.848 18:41:38 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:04:53.848 18:41:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:53.848 18:41:38 -- common/autotest_common.sh@10 -- # set +x 00:04:53.848 18:41:38 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:04:53.848 18:41:38 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:53.848 18:41:38 -- common/autotest_common.sh@10 -- # set +x 00:04:53.848 18:41:38 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:56.381 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:56.947 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.7 (8086 2021): 
ioatdma -> vfio-pci 00:04:56.947 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:56.947 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:57.882 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:57.882 18:41:42 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:04:57.882 18:41:42 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:57.882 18:41:42 -- common/autotest_common.sh@10 -- # set +x 00:04:57.882 18:41:42 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:04:57.882 18:41:42 -- common/autotest_common.sh@1589 -- # mapfile -t bdfs 00:04:57.882 18:41:42 -- common/autotest_common.sh@1589 -- # get_nvme_bdfs_by_id 0x0a54 00:04:57.882 18:41:42 -- common/autotest_common.sh@1575 -- # bdfs=() 00:04:57.882 18:41:42 -- common/autotest_common.sh@1575 -- # local bdfs 00:04:57.882 18:41:42 -- common/autotest_common.sh@1577 -- # get_nvme_bdfs 00:04:57.882 18:41:42 -- common/autotest_common.sh@1511 -- # bdfs=() 00:04:57.882 18:41:42 -- common/autotest_common.sh@1511 -- # local bdfs 00:04:57.882 18:41:42 -- common/autotest_common.sh@1512 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:57.882 18:41:42 -- common/autotest_common.sh@1512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:57.882 18:41:42 -- common/autotest_common.sh@1512 -- # jq -r '.config[].params.traddr' 00:04:58.140 18:41:42 -- common/autotest_common.sh@1513 -- # (( 1 == 0 )) 00:04:58.140 18:41:42 -- common/autotest_common.sh@1517 -- # printf '%s\n' 0000:5e:00.0 00:04:58.140 18:41:42 -- common/autotest_common.sh@1577 -- # for bdf in $(get_nvme_bdfs) 00:04:58.140 18:41:42 -- common/autotest_common.sh@1578 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:04:58.140 18:41:42 -- common/autotest_common.sh@1578 -- # device=0x0a54 00:04:58.140 18:41:42 -- common/autotest_common.sh@1579 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:58.140 18:41:42 -- common/autotest_common.sh@1580 -- # bdfs+=($bdf) 00:04:58.140 18:41:42 -- common/autotest_common.sh@1584 -- # printf '%s\n' 0000:5e:00.0 00:04:58.140 18:41:42 -- common/autotest_common.sh@1590 -- # [[ -z 0000:5e:00.0 ]] 00:04:58.140 18:41:42 -- common/autotest_common.sh@1595 -- # spdk_tgt_pid=1998855 00:04:58.140 18:41:42 -- common/autotest_common.sh@1596 -- # waitforlisten 1998855 00:04:58.140 18:41:42 -- common/autotest_common.sh@1594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:58.140 18:41:42 -- common/autotest_common.sh@829 -- # '[' -z 1998855 ']' 00:04:58.140 18:41:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:58.140 18:41:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:58.140 18:41:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:58.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:58.140 18:41:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:58.140 18:41:42 -- common/autotest_common.sh@10 -- # set +x 00:04:58.140 [2024-07-24 18:41:42.968006] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:04:58.140 [2024-07-24 18:41:42.968058] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1998855 ] 00:04:58.140 [2024-07-24 18:41:43.027862] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:58.140 [2024-07-24 18:41:43.100504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:04:59.074 18:41:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:59.074 18:41:43 -- common/autotest_common.sh@862 -- # return 0 00:04:59.074 18:41:43 -- common/autotest_common.sh@1598 -- # bdf_id=0 00:04:59.074 18:41:43 -- common/autotest_common.sh@1599 -- # for bdf in "${bdfs[@]}" 00:04:59.074 18:41:43 -- common/autotest_common.sh@1600 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0 00:05:02.355 nvme0n1 00:05:02.355 18:41:46 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:02.355 [2024-07-24 18:41:46.877905] nvme_opal.c:2063:spdk_opal_cmd_revert_tper: *ERROR*: Error on starting admin SP session with error 18 00:05:02.355 [2024-07-24 18:41:46.877938] vbdev_opal_rpc.c: 134:rpc_bdev_nvme_opal_revert: *ERROR*: Revert TPer failure: 18 00:05:02.355 request: 00:05:02.355 { 00:05:02.355 "nvme_ctrlr_name": "nvme0", 00:05:02.355 "password": "test", 00:05:02.355 "method": "bdev_nvme_opal_revert", 00:05:02.355 "req_id": 1 00:05:02.355 } 00:05:02.355 Got JSON-RPC error response 00:05:02.355 response: 00:05:02.355 { 00:05:02.355 "code": -32603, 00:05:02.355 "message": "Internal error" 00:05:02.355 } 00:05:02.355 18:41:46 -- common/autotest_common.sh@1602 -- # true 00:05:02.355 18:41:46 -- common/autotest_common.sh@1603 -- # (( ++bdf_id )) 00:05:02.355 18:41:46 -- common/autotest_common.sh@1606 -- # killprocess 1998855 00:05:02.355 18:41:46 -- common/autotest_common.sh@948 -- # '[' -z 1998855 ']' 00:05:02.355 18:41:46 -- common/autotest_common.sh@952 -- # kill -0 1998855 00:05:02.355 18:41:46 -- common/autotest_common.sh@953 -- # uname 00:05:02.355 18:41:46 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:02.355 18:41:46 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1998855 00:05:02.355 18:41:46 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:02.355 18:41:46 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:02.355 18:41:46 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1998855' 00:05:02.355 killing process with pid 1998855 00:05:02.355 18:41:46 -- common/autotest_common.sh@967 -- # kill 1998855 00:05:02.355 18:41:46 -- common/autotest_common.sh@972 -- # wait 1998855 00:05:03.729 18:41:48 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:03.729 18:41:48 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:03.729 18:41:48 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:03.729 18:41:48 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:03.729 18:41:48 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:04.295 Restarting all devices. 
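The opal_revert_cleanup step above starts spdk_tgt, attaches the controller at 0000:5e:00.0 as bdev_nvme controller nvme0, and issues bdev_nvme_opal_revert with the password "test"; on this drive the revert fails and the JSON-RPC error -32603 shown in the log is tolerated (the `true` after the call) before the target is killed. The same two RPCs can be replayed against a running target with rpc.py, using exactly the parameters visible in the traced request:

    # attach the PCIe controller and attempt the OPAL revert,
    # mirroring the RPC calls traced in this log (password "test" as shown)
    scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0
    scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test \
        || true   # revert may fail with -32603 on drives that reject the admin SP session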
00:05:07.583 lstat() error: No such file or directory 00:05:07.583 QAT Error: No GENERAL section found 00:05:07.583 Failed to configure qat_dev0 00:05:07.583 lstat() error: No such file or directory 00:05:07.583 QAT Error: No GENERAL section found 00:05:07.583 Failed to configure qat_dev1 00:05:07.583 lstat() error: No such file or directory 00:05:07.583 QAT Error: No GENERAL section found 00:05:07.583 Failed to configure qat_dev2 00:05:07.583 enable sriov 00:05:07.583 Checking status of all devices. 00:05:07.583 There is 3 QAT acceleration device(s) in the system: 00:05:07.583 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down 00:05:07.583 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down 00:05:07.583 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down 00:05:08.518 0000:1a:00.0 set to 16 VFs 00:05:09.083 0000:1c:00.0 set to 16 VFs 00:05:10.017 0000:1e:00.0 set to 16 VFs 00:05:11.423 Properly configured the qat device with driver uio_pci_generic. 00:05:11.423 18:41:56 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:11.423 18:41:56 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:11.423 18:41:56 -- common/autotest_common.sh@10 -- # set +x 00:05:11.423 18:41:56 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:11.423 18:41:56 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:11.423 18:41:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:11.423 18:41:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:11.423 18:41:56 -- common/autotest_common.sh@10 -- # set +x 00:05:11.423 ************************************ 00:05:11.423 START TEST env 00:05:11.423 ************************************ 00:05:11.423 18:41:56 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:11.423 * Looking for test storage... 
00:05:11.423 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:11.423 18:41:56 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:11.423 18:41:56 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:11.423 18:41:56 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:11.423 18:41:56 env -- common/autotest_common.sh@10 -- # set +x 00:05:11.423 ************************************ 00:05:11.423 START TEST env_memory 00:05:11.423 ************************************ 00:05:11.423 18:41:56 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:11.423 00:05:11.423 00:05:11.423 CUnit - A unit testing framework for C - Version 2.1-3 00:05:11.423 http://cunit.sourceforge.net/ 00:05:11.423 00:05:11.423 00:05:11.423 Suite: memory 00:05:11.423 Test: alloc and free memory map ...[2024-07-24 18:41:56.336190] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:11.423 passed 00:05:11.423 Test: mem map translation ...[2024-07-24 18:41:56.356513] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:11.423 [2024-07-24 18:41:56.356532] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:11.423 [2024-07-24 18:41:56.356572] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:11.423 [2024-07-24 18:41:56.356579] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:11.423 passed 00:05:11.423 Test: mem map registration ...[2024-07-24 18:41:56.396690] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:11.423 [2024-07-24 18:41:56.396707] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:11.423 passed 00:05:11.684 Test: mem map adjacent registrations ...passed 00:05:11.684 00:05:11.684 Run Summary: Type Total Ran Passed Failed Inactive 00:05:11.684 suites 1 1 n/a 0 0 00:05:11.684 tests 4 4 4 0 0 00:05:11.684 asserts 152 152 152 0 n/a 00:05:11.684 00:05:11.684 Elapsed time = 0.145 seconds 00:05:11.684 00:05:11.684 real 0m0.157s 00:05:11.684 user 0m0.144s 00:05:11.684 sys 0m0.012s 00:05:11.684 18:41:56 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:11.684 18:41:56 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:11.684 ************************************ 00:05:11.684 END TEST env_memory 00:05:11.684 ************************************ 00:05:11.684 18:41:56 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:11.684 18:41:56 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:11.684 18:41:56 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:11.684 18:41:56 env 
-- common/autotest_common.sh@10 -- # set +x 00:05:11.684 ************************************ 00:05:11.684 START TEST env_vtophys 00:05:11.684 ************************************ 00:05:11.684 18:41:56 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:11.684 EAL: lib.eal log level changed from notice to debug 00:05:11.684 EAL: Detected lcore 0 as core 0 on socket 0 00:05:11.684 EAL: Detected lcore 1 as core 1 on socket 0 00:05:11.684 EAL: Detected lcore 2 as core 2 on socket 0 00:05:11.684 EAL: Detected lcore 3 as core 3 on socket 0 00:05:11.684 EAL: Detected lcore 4 as core 4 on socket 0 00:05:11.684 EAL: Detected lcore 5 as core 5 on socket 0 00:05:11.684 EAL: Detected lcore 6 as core 6 on socket 0 00:05:11.684 EAL: Detected lcore 7 as core 8 on socket 0 00:05:11.684 EAL: Detected lcore 8 as core 9 on socket 0 00:05:11.684 EAL: Detected lcore 9 as core 10 on socket 0 00:05:11.684 EAL: Detected lcore 10 as core 11 on socket 0 00:05:11.684 EAL: Detected lcore 11 as core 12 on socket 0 00:05:11.684 EAL: Detected lcore 12 as core 13 on socket 0 00:05:11.684 EAL: Detected lcore 13 as core 16 on socket 0 00:05:11.684 EAL: Detected lcore 14 as core 17 on socket 0 00:05:11.684 EAL: Detected lcore 15 as core 18 on socket 0 00:05:11.684 EAL: Detected lcore 16 as core 19 on socket 0 00:05:11.684 EAL: Detected lcore 17 as core 20 on socket 0 00:05:11.684 EAL: Detected lcore 18 as core 21 on socket 0 00:05:11.684 EAL: Detected lcore 19 as core 25 on socket 0 00:05:11.684 EAL: Detected lcore 20 as core 26 on socket 0 00:05:11.684 EAL: Detected lcore 21 as core 27 on socket 0 00:05:11.684 EAL: Detected lcore 22 as core 28 on socket 0 00:05:11.684 EAL: Detected lcore 23 as core 29 on socket 0 00:05:11.684 EAL: Detected lcore 24 as core 0 on socket 1 00:05:11.684 EAL: Detected lcore 25 as core 1 on socket 1 00:05:11.684 EAL: Detected lcore 26 as core 2 on socket 1 00:05:11.684 EAL: Detected lcore 27 as core 3 on socket 1 00:05:11.684 EAL: Detected lcore 28 as core 4 on socket 1 00:05:11.684 EAL: Detected lcore 29 as core 5 on socket 1 00:05:11.684 EAL: Detected lcore 30 as core 6 on socket 1 00:05:11.684 EAL: Detected lcore 31 as core 8 on socket 1 00:05:11.684 EAL: Detected lcore 32 as core 9 on socket 1 00:05:11.684 EAL: Detected lcore 33 as core 10 on socket 1 00:05:11.684 EAL: Detected lcore 34 as core 11 on socket 1 00:05:11.684 EAL: Detected lcore 35 as core 12 on socket 1 00:05:11.684 EAL: Detected lcore 36 as core 13 on socket 1 00:05:11.684 EAL: Detected lcore 37 as core 16 on socket 1 00:05:11.684 EAL: Detected lcore 38 as core 17 on socket 1 00:05:11.684 EAL: Detected lcore 39 as core 18 on socket 1 00:05:11.684 EAL: Detected lcore 40 as core 19 on socket 1 00:05:11.684 EAL: Detected lcore 41 as core 20 on socket 1 00:05:11.684 EAL: Detected lcore 42 as core 21 on socket 1 00:05:11.684 EAL: Detected lcore 43 as core 25 on socket 1 00:05:11.684 EAL: Detected lcore 44 as core 26 on socket 1 00:05:11.684 EAL: Detected lcore 45 as core 27 on socket 1 00:05:11.684 EAL: Detected lcore 46 as core 28 on socket 1 00:05:11.684 EAL: Detected lcore 47 as core 29 on socket 1 00:05:11.684 EAL: Detected lcore 48 as core 0 on socket 0 00:05:11.684 EAL: Detected lcore 49 as core 1 on socket 0 00:05:11.684 EAL: Detected lcore 50 as core 2 on socket 0 00:05:11.684 EAL: Detected lcore 51 as core 3 on socket 0 00:05:11.684 EAL: Detected lcore 52 as core 4 on socket 0 00:05:11.684 EAL: Detected lcore 53 as core 5 on socket 0 
00:05:11.684 EAL: Detected lcore 54 as core 6 on socket 0 00:05:11.684 EAL: Detected lcore 55 as core 8 on socket 0 00:05:11.684 EAL: Detected lcore 56 as core 9 on socket 0 00:05:11.684 EAL: Detected lcore 57 as core 10 on socket 0 00:05:11.684 EAL: Detected lcore 58 as core 11 on socket 0 00:05:11.684 EAL: Detected lcore 59 as core 12 on socket 0 00:05:11.684 EAL: Detected lcore 60 as core 13 on socket 0 00:05:11.684 EAL: Detected lcore 61 as core 16 on socket 0 00:05:11.684 EAL: Detected lcore 62 as core 17 on socket 0 00:05:11.685 EAL: Detected lcore 63 as core 18 on socket 0 00:05:11.685 EAL: Detected lcore 64 as core 19 on socket 0 00:05:11.685 EAL: Detected lcore 65 as core 20 on socket 0 00:05:11.685 EAL: Detected lcore 66 as core 21 on socket 0 00:05:11.685 EAL: Detected lcore 67 as core 25 on socket 0 00:05:11.685 EAL: Detected lcore 68 as core 26 on socket 0 00:05:11.685 EAL: Detected lcore 69 as core 27 on socket 0 00:05:11.685 EAL: Detected lcore 70 as core 28 on socket 0 00:05:11.685 EAL: Detected lcore 71 as core 29 on socket 0 00:05:11.685 EAL: Detected lcore 72 as core 0 on socket 1 00:05:11.685 EAL: Detected lcore 73 as core 1 on socket 1 00:05:11.685 EAL: Detected lcore 74 as core 2 on socket 1 00:05:11.685 EAL: Detected lcore 75 as core 3 on socket 1 00:05:11.685 EAL: Detected lcore 76 as core 4 on socket 1 00:05:11.685 EAL: Detected lcore 77 as core 5 on socket 1 00:05:11.685 EAL: Detected lcore 78 as core 6 on socket 1 00:05:11.685 EAL: Detected lcore 79 as core 8 on socket 1 00:05:11.685 EAL: Detected lcore 80 as core 9 on socket 1 00:05:11.685 EAL: Detected lcore 81 as core 10 on socket 1 00:05:11.685 EAL: Detected lcore 82 as core 11 on socket 1 00:05:11.685 EAL: Detected lcore 83 as core 12 on socket 1 00:05:11.685 EAL: Detected lcore 84 as core 13 on socket 1 00:05:11.685 EAL: Detected lcore 85 as core 16 on socket 1 00:05:11.685 EAL: Detected lcore 86 as core 17 on socket 1 00:05:11.685 EAL: Detected lcore 87 as core 18 on socket 1 00:05:11.685 EAL: Detected lcore 88 as core 19 on socket 1 00:05:11.685 EAL: Detected lcore 89 as core 20 on socket 1 00:05:11.685 EAL: Detected lcore 90 as core 21 on socket 1 00:05:11.685 EAL: Detected lcore 91 as core 25 on socket 1 00:05:11.685 EAL: Detected lcore 92 as core 26 on socket 1 00:05:11.685 EAL: Detected lcore 93 as core 27 on socket 1 00:05:11.685 EAL: Detected lcore 94 as core 28 on socket 1 00:05:11.685 EAL: Detected lcore 95 as core 29 on socket 1 00:05:11.685 EAL: Maximum logical cores by configuration: 128 00:05:11.685 EAL: Detected CPU lcores: 96 00:05:11.685 EAL: Detected NUMA nodes: 2 00:05:11.685 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:11.685 EAL: Detected shared linkage of DPDK 00:05:11.685 EAL: No shared files mode enabled, IPC will be disabled 00:05:11.685 EAL: No shared files mode enabled, IPC is disabled 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:05:11.685 
EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:05:11.685 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:05:11.685 EAL: Bus pci wants IOVA as 'PA' 00:05:11.685 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:11.685 EAL: Bus vdev wants IOVA as 'DC' 00:05:11.685 EAL: Selected IOVA mode 'PA' 00:05:11.685 EAL: Probing VFIO support... 
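(Aside, not part of the log: the "invalid spdk_mem_map_set_translation parameters" errors in the env_memory output above are expected failures — the test passes deliberately mis-aligned arguments, since mem map translations are tracked in 2 MB chunks. A minimal sketch of a valid call sequence, assuming the mem map API declared in spdk/env.h and an already-initialized SPDK environment:)

    #include "spdk/env.h"

    static void mem_map_sketch(void)
    {
        /* NULL ops: no notify callback, as in the simple alloc-and-free case */
        struct spdk_mem_map *map = spdk_mem_map_alloc(0, NULL, NULL);

        /* vaddr and len are both 2 MB multiples, so this call is accepted ... */
        spdk_mem_map_set_translation(map, 0x200000, 0x200000, 0x12345);
        /* ... whereas vaddr=2097152 len=1234 (as in the log above) is rejected,
         * because 1234 is not a multiple of 2 MB. */

        spdk_mem_map_free(&map);
    }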
00:05:11.685 EAL: IOMMU type 1 (Type 1) is supported 00:05:11.685 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:11.685 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:11.685 EAL: VFIO support initialized 00:05:11.685 EAL: Ask a virtual area of 0x2e000 bytes 00:05:11.685 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:11.685 EAL: Setting up physically contiguous memory... 00:05:11.685 EAL: Setting maximum number of open files to 524288 00:05:11.685 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:11.685 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:11.685 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:11.685 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:11.685 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.685 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:11.685 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.685 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.685 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:11.685 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:11.685 EAL: Hugepages will be freed exactly as allocated. 00:05:11.685 EAL: No shared files mode enabled, IPC is disabled 00:05:11.685 EAL: No shared files mode enabled, IPC is disabled 00:05:11.685 EAL: TSC frequency is ~2100000 KHz 00:05:11.685 EAL: Main lcore 0 is ready (tid=7f1214cfdb00;cpuset=[0]) 00:05:11.685 EAL: Trying to obtain current memory policy. 00:05:11.685 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.685 EAL: Restoring previous memory policy: 0 00:05:11.685 EAL: request: mp_malloc_sync 00:05:11.685 EAL: No shared files mode enabled, IPC is disabled 00:05:11.685 EAL: Heap on socket 0 was expanded by 2MB 00:05:11.685 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:05:11.685 EAL: probe driver: 8086:37c9 qat 00:05:11.685 EAL: PCI memory mapped at 0x202001000000 00:05:11.685 EAL: PCI memory mapped at 0x202001001000 00:05:11.685 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:11.685 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001002000 00:05:11.686 EAL: PCI memory mapped at 0x202001003000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001004000 00:05:11.686 EAL: PCI memory mapped at 0x202001005000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001006000 00:05:11.686 EAL: PCI memory mapped at 0x202001007000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001008000 00:05:11.686 EAL: PCI memory mapped at 0x202001009000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200100a000 00:05:11.686 EAL: PCI memory mapped at 0x20200100b000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200100c000 00:05:11.686 EAL: PCI memory mapped at 0x20200100d000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200100e000 00:05:11.686 EAL: PCI memory mapped at 0x20200100f000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 
00:05:11.686 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001010000 00:05:11.686 EAL: PCI memory mapped at 0x202001011000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001012000 00:05:11.686 EAL: PCI memory mapped at 0x202001013000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001014000 00:05:11.686 EAL: PCI memory mapped at 0x202001015000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001016000 00:05:11.686 EAL: PCI memory mapped at 0x202001017000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001018000 00:05:11.686 EAL: PCI memory mapped at 0x202001019000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200101a000 00:05:11.686 EAL: PCI memory mapped at 0x20200101b000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200101c000 00:05:11.686 EAL: PCI memory mapped at 0x20200101d000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:11.686 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200101e000 00:05:11.686 EAL: PCI memory mapped at 0x20200101f000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001020000 00:05:11.686 EAL: PCI memory mapped at 0x202001021000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001022000 00:05:11.686 EAL: PCI memory mapped at 0x202001023000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001024000 00:05:11.686 EAL: PCI memory mapped at 0x202001025000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.3 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001026000 00:05:11.686 EAL: PCI memory mapped at 0x202001027000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 
(socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.4 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001028000 00:05:11.686 EAL: PCI memory mapped at 0x202001029000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.5 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200102a000 00:05:11.686 EAL: PCI memory mapped at 0x20200102b000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.6 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200102c000 00:05:11.686 EAL: PCI memory mapped at 0x20200102d000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:01.7 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200102e000 00:05:11.686 EAL: PCI memory mapped at 0x20200102f000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.0 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001030000 00:05:11.686 EAL: PCI memory mapped at 0x202001031000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.1 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001032000 00:05:11.686 EAL: PCI memory mapped at 0x202001033000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.2 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001034000 00:05:11.686 EAL: PCI memory mapped at 0x202001035000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.3 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001036000 00:05:11.686 EAL: PCI memory mapped at 0x202001037000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.4 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001038000 00:05:11.686 EAL: PCI memory mapped at 0x202001039000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.5 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200103a000 00:05:11.686 EAL: PCI memory mapped at 0x20200103b000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.6 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200103c000 00:05:11.686 EAL: PCI memory mapped at 0x20200103d000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:11.686 EAL: PCI device 0000:1c:02.7 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x20200103e000 00:05:11.686 EAL: PCI memory mapped at 0x20200103f000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:02.7 (socket 0) 00:05:11.686 EAL: PCI device 0000:1e:01.0 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001040000 00:05:11.686 EAL: PCI memory mapped at 0x202001041000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:11.686 EAL: PCI device 0000:1e:01.1 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001042000 00:05:11.686 EAL: PCI memory mapped at 0x202001043000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:11.686 EAL: PCI device 0000:1e:01.2 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001044000 00:05:11.686 EAL: PCI memory mapped at 0x202001045000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:11.686 EAL: PCI device 0000:1e:01.3 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001046000 00:05:11.686 EAL: PCI memory mapped at 0x202001047000 00:05:11.686 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:11.686 EAL: PCI device 0000:1e:01.4 on NUMA socket 0 00:05:11.686 EAL: probe driver: 8086:37c9 qat 00:05:11.686 EAL: PCI memory mapped at 0x202001048000 00:05:11.687 EAL: PCI memory mapped at 0x202001049000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:01.5 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200104a000 00:05:11.687 EAL: PCI memory mapped at 0x20200104b000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:01.6 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200104c000 00:05:11.687 EAL: PCI memory mapped at 0x20200104d000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:01.7 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200104e000 00:05:11.687 EAL: PCI memory mapped at 0x20200104f000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.0 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x202001050000 00:05:11.687 EAL: PCI memory mapped at 0x202001051000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.1 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x202001052000 00:05:11.687 EAL: PCI memory mapped at 0x202001053000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.2 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x202001054000 00:05:11.687 EAL: PCI memory mapped at 0x202001055000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.3 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x202001056000 00:05:11.687 EAL: PCI memory mapped at 0x202001057000 00:05:11.687 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.4 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x202001058000 00:05:11.687 EAL: PCI memory mapped at 0x202001059000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.5 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200105a000 00:05:11.687 EAL: PCI memory mapped at 0x20200105b000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.6 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200105c000 00:05:11.687 EAL: PCI memory mapped at 0x20200105d000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:11.687 EAL: PCI device 0000:1e:02.7 on NUMA socket 0 00:05:11.687 EAL: probe driver: 8086:37c9 qat 00:05:11.687 EAL: PCI memory mapped at 0x20200105e000 00:05:11.687 EAL: PCI memory mapped at 0x20200105f000 00:05:11.687 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:11.687 EAL: Mem event callback 'spdk:(nil)' registered 00:05:11.687 00:05:11.687 00:05:11.687 CUnit - A unit testing framework for C - Version 2.1-3 00:05:11.687 http://cunit.sourceforge.net/ 00:05:11.687 00:05:11.687 00:05:11.687 Suite: components_suite 00:05:11.687 Test: vtophys_malloc_test ...passed 00:05:11.687 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 4MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 4MB 00:05:11.687 EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 6MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 6MB 00:05:11.687 EAL: Trying to obtain current memory policy. 
00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 10MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 10MB 00:05:11.687 EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 18MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 18MB 00:05:11.687 EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 34MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 34MB 00:05:11.687 EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 66MB 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was shrunk by 66MB 00:05:11.687 EAL: Trying to obtain current memory policy. 00:05:11.687 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.687 EAL: Restoring previous memory policy: 4 00:05:11.687 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.687 EAL: request: mp_malloc_sync 00:05:11.687 EAL: No shared files mode enabled, IPC is disabled 00:05:11.687 EAL: Heap on socket 0 was expanded by 130MB 00:05:11.946 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.946 EAL: request: mp_malloc_sync 00:05:11.946 EAL: No shared files mode enabled, IPC is disabled 00:05:11.946 EAL: Heap on socket 0 was shrunk by 130MB 00:05:11.946 EAL: Trying to obtain current memory policy. 
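(Aside, not part of the log: the expand/shrink pairs above come from vtophys_spdk_malloc_test allocating buffers that roughly double in size each round, forcing the DPDK heap to grow and then hand the hugepages back. A minimal sketch of the same idea, assuming spdk_dma_zmalloc/spdk_vtophys from spdk/env.h and an initialized environment:)

    #include "spdk/env.h"

    static void vtophys_sketch(void)
    {
        for (size_t size = 4UL << 20; size <= 1024UL << 20; size *= 2) {
            void *buf = spdk_dma_zmalloc(size, 0x200000, NULL);
            if (buf == NULL) {
                break;               /* heap could not be expanded this far */
            }
            uint64_t paddr = spdk_vtophys(buf, NULL);
            (void)paddr;             /* SPDK_VTOPHYS_ERROR would mean no mapping */
            spdk_dma_free(buf);      /* may let the heap shrink again, as logged */
        }
    }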
00:05:11.946 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.946 EAL: Restoring previous memory policy: 4 00:05:11.946 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.946 EAL: request: mp_malloc_sync 00:05:11.946 EAL: No shared files mode enabled, IPC is disabled 00:05:11.946 EAL: Heap on socket 0 was expanded by 258MB 00:05:11.946 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.946 EAL: request: mp_malloc_sync 00:05:11.946 EAL: No shared files mode enabled, IPC is disabled 00:05:11.946 EAL: Heap on socket 0 was shrunk by 258MB 00:05:11.946 EAL: Trying to obtain current memory policy. 00:05:11.946 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.946 EAL: Restoring previous memory policy: 4 00:05:11.946 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.946 EAL: request: mp_malloc_sync 00:05:11.946 EAL: No shared files mode enabled, IPC is disabled 00:05:11.946 EAL: Heap on socket 0 was expanded by 514MB 00:05:12.205 EAL: Calling mem event callback 'spdk:(nil)' 00:05:12.205 EAL: request: mp_malloc_sync 00:05:12.205 EAL: No shared files mode enabled, IPC is disabled 00:05:12.205 EAL: Heap on socket 0 was shrunk by 514MB 00:05:12.205 EAL: Trying to obtain current memory policy. 00:05:12.205 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:12.464 EAL: Restoring previous memory policy: 4 00:05:12.464 EAL: Calling mem event callback 'spdk:(nil)' 00:05:12.464 EAL: request: mp_malloc_sync 00:05:12.464 EAL: No shared files mode enabled, IPC is disabled 00:05:12.464 EAL: Heap on socket 0 was expanded by 1026MB 00:05:12.464 EAL: Calling mem event callback 'spdk:(nil)' 00:05:12.722 EAL: request: mp_malloc_sync 00:05:12.722 EAL: No shared files mode enabled, IPC is disabled 00:05:12.722 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:12.722 passed 00:05:12.722 00:05:12.722 Run Summary: Type Total Ran Passed Failed Inactive 00:05:12.722 suites 1 1 n/a 0 0 00:05:12.722 tests 2 2 2 0 0 00:05:12.722 asserts 6618 6618 6618 0 n/a 00:05:12.722 00:05:12.722 Elapsed time = 0.958 seconds 00:05:12.722 EAL: No shared files mode enabled, IPC is disabled 00:05:12.722 EAL: No shared files mode enabled, IPC is disabled 00:05:12.722 EAL: No shared files mode enabled, IPC is disabled 00:05:12.722 00:05:12.722 real 0m1.083s 00:05:12.722 user 0m0.635s 00:05:12.722 sys 0m0.423s 00:05:12.722 18:41:57 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:12.723 18:41:57 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:12.723 ************************************ 00:05:12.723 END TEST env_vtophys 00:05:12.723 ************************************ 00:05:12.723 18:41:57 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:12.723 18:41:57 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:12.723 18:41:57 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:12.723 18:41:57 env -- common/autotest_common.sh@10 -- # set +x 00:05:12.723 ************************************ 00:05:12.723 START TEST env_pci 00:05:12.723 ************************************ 00:05:12.723 18:41:57 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:12.723 00:05:12.723 00:05:12.723 CUnit - A unit testing framework for C - Version 2.1-3 00:05:12.723 http://cunit.sourceforge.net/ 00:05:12.723 00:05:12.723 00:05:12.723 Suite: pci 00:05:12.723 Test: pci_hook ...[2024-07-24 18:41:57.686530] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2001467 has claimed it 00:05:12.723 EAL: Cannot find device (10000:00:01.0) 00:05:12.723 EAL: Failed to attach device on primary process 00:05:12.723 passed 00:05:12.723 00:05:12.723 Run Summary: Type Total Ran Passed Failed Inactive 00:05:12.723 suites 1 1 n/a 0 0 00:05:12.723 tests 1 1 1 0 0 00:05:12.723 asserts 25 25 25 0 n/a 00:05:12.723 00:05:12.723 Elapsed time = 0.028 seconds 00:05:12.723 00:05:12.723 real 0m0.054s 00:05:12.723 user 0m0.023s 00:05:12.723 sys 0m0.031s 00:05:12.723 18:41:57 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:12.723 18:41:57 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:12.723 ************************************ 00:05:12.723 END TEST env_pci 00:05:12.723 ************************************ 00:05:12.982 18:41:57 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:12.982 18:41:57 env -- env/env.sh@15 -- # uname 00:05:12.982 18:41:57 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:12.982 18:41:57 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:12.982 18:41:57 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:12.982 18:41:57 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:05:12.982 18:41:57 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:12.982 18:41:57 env -- common/autotest_common.sh@10 -- # set +x 00:05:12.982 ************************************ 00:05:12.982 START TEST env_dpdk_post_init 00:05:12.982 ************************************ 00:05:12.982 18:41:57 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:12.982 EAL: Detected CPU lcores: 96 00:05:12.982 EAL: Detected NUMA nodes: 2 00:05:12.982 EAL: Detected shared linkage of DPDK 00:05:12.982 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:12.982 EAL: Selected IOVA mode 'PA' 00:05:12.982 EAL: VFIO support initialized 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:12.982 CRYPTODEV: 
Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 
0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue 
pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:12.982 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:12.982 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 
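(Aside, not part of the log: env_dpdk_post_init above is launched with "-c 0x1 --base-virtaddr=0x200000000000". Expressed through the SPDK env API, those options map roughly onto spdk_env_opts as sketched below; the field names are assumptions taken from spdk/env.h, not something this log confirms:)

    #include "spdk/env.h"

    static int env_init_sketch(void)
    {
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "env_dpdk_post_init";        /* hypothetical app name */
        opts.core_mask = "0x1";                  /* corresponds to -c 0x1 */
        opts.base_virtaddr = 0x200000000000ULL;  /* corresponds to --base-virtaddr */

        /* spdk_env_init() drives the EAL setup and ioat/qat/NVMe probing
         * seen in this log section. */
        return spdk_env_init(&opts);
    }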
00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 
0000:1e:02.3_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.983 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:12.983 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.983 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:12.983 EAL: Using IOMMU type 1 (Type 1) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:12.983 EAL: Ignore mapping IO port bar(1) 00:05:12.983 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:13.241 EAL: Ignore mapping IO port bar(1) 00:05:13.241 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:13.241 EAL: Ignore mapping IO port bar(1) 00:05:13.241 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:13.808 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:5e:00.0 (socket 0) 00:05:13.808 EAL: Ignore mapping IO port bar(1) 00:05:13.808 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:13.808 EAL: Ignore mapping IO port 
bar(1) 00:05:13.808 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:13.808 EAL: Ignore mapping IO port bar(1) 00:05:13.808 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:13.808 EAL: Ignore mapping IO port bar(1) 00:05:13.808 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:13.808 EAL: Ignore mapping IO port bar(1) 00:05:13.808 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:14.067 EAL: Ignore mapping IO port bar(1) 00:05:14.067 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:05:14.067 EAL: Ignore mapping IO port bar(1) 00:05:14.067 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:14.067 EAL: Ignore mapping IO port bar(1) 00:05:14.067 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:17.358 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:17.358 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:17.358 Starting DPDK initialization... 00:05:17.358 Starting SPDK post initialization... 00:05:17.358 SPDK NVMe probe 00:05:17.358 Attaching to 0000:5e:00.0 00:05:17.358 Attached to 0000:5e:00.0 00:05:17.358 Cleaning up... 00:05:17.358 00:05:17.358 real 0m4.328s 00:05:17.358 user 0m3.256s 00:05:17.358 sys 0m0.144s 00:05:17.358 18:42:02 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.358 18:42:02 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:17.358 ************************************ 00:05:17.358 END TEST env_dpdk_post_init 00:05:17.358 ************************************ 00:05:17.358 18:42:02 env -- env/env.sh@26 -- # uname 00:05:17.358 18:42:02 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:17.358 18:42:02 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:17.358 18:42:02 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:17.358 18:42:02 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.358 18:42:02 env -- common/autotest_common.sh@10 -- # set +x 00:05:17.358 ************************************ 00:05:17.358 START TEST env_mem_callbacks 00:05:17.358 ************************************ 00:05:17.358 18:42:02 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:17.358 EAL: Detected CPU lcores: 96 00:05:17.358 EAL: Detected NUMA nodes: 2 00:05:17.358 EAL: Detected shared linkage of DPDK 00:05:17.358 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:17.358 EAL: Selected IOVA mode 'PA' 00:05:17.358 EAL: VFIO support initialized 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: 
Creating cryptodev 0000:1a:01.1_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 
00:05:17.358 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.358 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:17.358 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters 
- name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max 
queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:17.359 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:17.359 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 
00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:17.360 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:17.360 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:17.360 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:17.360 00:05:17.360 00:05:17.360 CUnit - A unit testing framework for C - Version 2.1-3 00:05:17.360 http://cunit.sourceforge.net/ 00:05:17.360 00:05:17.360 00:05:17.360 Suite: memory 00:05:17.360 Test: test ... 
00:05:17.360 register 0x200000200000 2097152 00:05:17.360 malloc 3145728 00:05:17.360 register 0x200000400000 4194304 00:05:17.360 buf 0x200000500000 len 3145728 PASSED 00:05:17.360 malloc 64 00:05:17.360 buf 0x2000004fff40 len 64 PASSED 00:05:17.360 malloc 4194304 00:05:17.360 register 0x200000800000 6291456 00:05:17.360 buf 0x200000a00000 len 4194304 PASSED 00:05:17.360 free 0x200000500000 3145728 00:05:17.360 free 0x2000004fff40 64 00:05:17.360 unregister 0x200000400000 4194304 PASSED 00:05:17.360 free 0x200000a00000 4194304 00:05:17.360 unregister 0x200000800000 6291456 PASSED 00:05:17.360 malloc 8388608 00:05:17.360 register 0x200000400000 10485760 00:05:17.360 buf 0x200000600000 len 8388608 PASSED 00:05:17.360 free 0x200000600000 8388608 00:05:17.360 unregister 0x200000400000 10485760 PASSED 00:05:17.360 passed 00:05:17.360 00:05:17.360 Run Summary: Type Total Ran Passed Failed Inactive 00:05:17.360 suites 1 1 n/a 0 0 00:05:17.360 tests 1 1 1 0 0 00:05:17.360 asserts 15 15 15 0 n/a 00:05:17.360 00:05:17.360 Elapsed time = 0.006 seconds 00:05:17.360 00:05:17.360 real 0m0.068s 00:05:17.360 user 0m0.024s 00:05:17.360 sys 0m0.044s 00:05:17.360 18:42:02 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.360 18:42:02 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:17.360 ************************************ 00:05:17.360 END TEST env_mem_callbacks 00:05:17.360 ************************************ 00:05:17.360 00:05:17.360 real 0m6.101s 00:05:17.360 user 0m4.235s 00:05:17.360 sys 0m0.941s 00:05:17.360 18:42:02 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.360 18:42:02 env -- common/autotest_common.sh@10 -- # set +x 00:05:17.360 ************************************ 00:05:17.360 END TEST env 00:05:17.360 ************************************ 00:05:17.360 18:42:02 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:17.360 18:42:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:17.360 18:42:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.360 18:42:02 -- common/autotest_common.sh@10 -- # set +x 00:05:17.360 ************************************ 00:05:17.360 START TEST rpc 00:05:17.360 ************************************ 00:05:17.360 18:42:02 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:17.620 * Looking for test storage... 00:05:17.620 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:17.620 18:42:02 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2002571 00:05:17.620 18:42:02 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:17.620 18:42:02 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:17.620 18:42:02 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2002571 00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@829 -- # '[' -z 2002571 ']' 00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:17.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
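For reference, the rpc.sh suite that starts here exercises a bare spdk_tgt (launched above with '-e bdev') purely over its JSON-RPC UNIX socket. The bdev calls that rpc_integrity drives through rpc_cmd further down in this log can also be replayed by hand; the lines below are a minimal sketch under the assumption of a standard SPDK checkout (the scripts/rpc.py path and its -s option are that assumption), while the socket path, method names, and arguments are the ones visible in the trace that follows.

# manual replay of the rpc_integrity steps against a running spdk_tgt
# (default JSON-RPC socket /var/tmp/spdk.sock, the same rpc_addr waitforlisten polls above)
RPC="./scripts/rpc.py -s /var/tmp/spdk.sock"        # rpc.py path assumed; adjust to your checkout
$RPC spdk_get_version                               # sanity check that the target is listening
$RPC bdev_malloc_create 8 512                       # 8 MB malloc bdev, 512-byte blocks -> Malloc0
$RPC bdev_passthru_create -b Malloc0 -p Passthru0   # stack a passthru bdev on top of Malloc0
$RPC bdev_get_bdevs | jq length                     # the test expects 2 bdevs at this point
$RPC bdev_passthru_delete Passthru0                 # tear down in reverse order
$RPC bdev_malloc_delete Malloc0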
00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:17.620 18:42:02 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:17.620 [2024-07-24 18:42:02.502338] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:17.620 [2024-07-24 18:42:02.502380] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2002571 ] 00:05:17.620 [2024-07-24 18:42:02.567795] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:17.879 [2024-07-24 18:42:02.646313] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:17.879 [2024-07-24 18:42:02.646346] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2002571' to capture a snapshot of events at runtime. 00:05:17.879 [2024-07-24 18:42:02.646356] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:17.879 [2024-07-24 18:42:02.646362] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:17.879 [2024-07-24 18:42:02.646366] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2002571 for offline analysis/debug. 00:05:17.879 [2024-07-24 18:42:02.646384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.448 18:42:03 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:18.448 18:42:03 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:18.448 18:42:03 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:18.448 18:42:03 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:18.448 18:42:03 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:18.448 18:42:03 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:18.448 18:42:03 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:18.448 18:42:03 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:18.448 18:42:03 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.448 ************************************ 00:05:18.448 START TEST rpc_integrity 00:05:18.448 ************************************ 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:18.448 18:42:03 
rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.448 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:18.448 { 00:05:18.448 "name": "Malloc0", 00:05:18.448 "aliases": [ 00:05:18.448 "6850d5e1-fce6-4212-8b11-6194477f7fb0" 00:05:18.448 ], 00:05:18.448 "product_name": "Malloc disk", 00:05:18.448 "block_size": 512, 00:05:18.448 "num_blocks": 16384, 00:05:18.448 "uuid": "6850d5e1-fce6-4212-8b11-6194477f7fb0", 00:05:18.448 "assigned_rate_limits": { 00:05:18.448 "rw_ios_per_sec": 0, 00:05:18.448 "rw_mbytes_per_sec": 0, 00:05:18.448 "r_mbytes_per_sec": 0, 00:05:18.448 "w_mbytes_per_sec": 0 00:05:18.448 }, 00:05:18.448 "claimed": false, 00:05:18.448 "zoned": false, 00:05:18.448 "supported_io_types": { 00:05:18.448 "read": true, 00:05:18.448 "write": true, 00:05:18.448 "unmap": true, 00:05:18.448 "flush": true, 00:05:18.448 "reset": true, 00:05:18.448 "nvme_admin": false, 00:05:18.448 "nvme_io": false, 00:05:18.448 "nvme_io_md": false, 00:05:18.448 "write_zeroes": true, 00:05:18.448 "zcopy": true, 00:05:18.448 "get_zone_info": false, 00:05:18.448 "zone_management": false, 00:05:18.448 "zone_append": false, 00:05:18.448 "compare": false, 00:05:18.448 "compare_and_write": false, 00:05:18.448 "abort": true, 00:05:18.448 "seek_hole": false, 00:05:18.448 "seek_data": false, 00:05:18.448 "copy": true, 00:05:18.448 "nvme_iov_md": false 00:05:18.448 }, 00:05:18.448 "memory_domains": [ 00:05:18.448 { 00:05:18.448 "dma_device_id": "system", 00:05:18.448 "dma_device_type": 1 00:05:18.448 }, 00:05:18.448 { 00:05:18.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.448 "dma_device_type": 2 00:05:18.448 } 00:05:18.448 ], 00:05:18.448 "driver_specific": {} 00:05:18.448 } 00:05:18.448 ]' 00:05:18.448 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 [2024-07-24 18:42:03.462821] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:18.707 [2024-07-24 18:42:03.462848] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:18.707 [2024-07-24 18:42:03.462859] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b841d0 00:05:18.707 [2024-07-24 18:42:03.462865] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:18.707 [2024-07-24 18:42:03.463936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:18.707 [2024-07-24 18:42:03.463956] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:18.707 Passthru0 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:18.707 { 00:05:18.707 "name": "Malloc0", 00:05:18.707 "aliases": [ 00:05:18.707 "6850d5e1-fce6-4212-8b11-6194477f7fb0" 00:05:18.707 ], 00:05:18.707 "product_name": "Malloc disk", 00:05:18.707 "block_size": 512, 00:05:18.707 "num_blocks": 16384, 00:05:18.707 "uuid": "6850d5e1-fce6-4212-8b11-6194477f7fb0", 00:05:18.707 "assigned_rate_limits": { 00:05:18.707 "rw_ios_per_sec": 0, 00:05:18.707 "rw_mbytes_per_sec": 0, 00:05:18.707 "r_mbytes_per_sec": 0, 00:05:18.707 "w_mbytes_per_sec": 0 00:05:18.707 }, 00:05:18.707 "claimed": true, 00:05:18.707 "claim_type": "exclusive_write", 00:05:18.707 "zoned": false, 00:05:18.707 "supported_io_types": { 00:05:18.707 "read": true, 00:05:18.707 "write": true, 00:05:18.707 "unmap": true, 00:05:18.707 "flush": true, 00:05:18.707 "reset": true, 00:05:18.707 "nvme_admin": false, 00:05:18.707 "nvme_io": false, 00:05:18.707 "nvme_io_md": false, 00:05:18.707 "write_zeroes": true, 00:05:18.707 "zcopy": true, 00:05:18.707 "get_zone_info": false, 00:05:18.707 "zone_management": false, 00:05:18.707 "zone_append": false, 00:05:18.707 "compare": false, 00:05:18.707 "compare_and_write": false, 00:05:18.707 "abort": true, 00:05:18.707 "seek_hole": false, 00:05:18.707 "seek_data": false, 00:05:18.707 "copy": true, 00:05:18.707 "nvme_iov_md": false 00:05:18.707 }, 00:05:18.707 "memory_domains": [ 00:05:18.707 { 00:05:18.707 "dma_device_id": "system", 00:05:18.707 "dma_device_type": 1 00:05:18.707 }, 00:05:18.707 { 00:05:18.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.707 "dma_device_type": 2 00:05:18.707 } 00:05:18.707 ], 00:05:18.707 "driver_specific": {} 00:05:18.707 }, 00:05:18.707 { 00:05:18.707 "name": "Passthru0", 00:05:18.707 "aliases": [ 00:05:18.707 "9125ae33-998a-5fbb-bcb9-a91b04262d40" 00:05:18.707 ], 00:05:18.707 "product_name": "passthru", 00:05:18.707 "block_size": 512, 00:05:18.707 "num_blocks": 16384, 00:05:18.707 "uuid": "9125ae33-998a-5fbb-bcb9-a91b04262d40", 00:05:18.707 "assigned_rate_limits": { 00:05:18.707 "rw_ios_per_sec": 0, 00:05:18.707 "rw_mbytes_per_sec": 0, 00:05:18.707 "r_mbytes_per_sec": 0, 00:05:18.707 "w_mbytes_per_sec": 0 00:05:18.707 }, 00:05:18.707 "claimed": false, 00:05:18.707 "zoned": false, 00:05:18.707 "supported_io_types": { 00:05:18.707 "read": true, 00:05:18.707 "write": true, 00:05:18.707 "unmap": true, 00:05:18.707 "flush": true, 00:05:18.707 "reset": true, 00:05:18.707 "nvme_admin": false, 00:05:18.707 "nvme_io": false, 00:05:18.707 "nvme_io_md": false, 00:05:18.707 "write_zeroes": true, 00:05:18.707 "zcopy": true, 00:05:18.707 "get_zone_info": false, 00:05:18.707 "zone_management": false, 00:05:18.707 "zone_append": false, 00:05:18.707 "compare": false, 00:05:18.707 "compare_and_write": false, 00:05:18.707 "abort": true, 00:05:18.707 "seek_hole": false, 00:05:18.707 "seek_data": false, 00:05:18.707 "copy": true, 00:05:18.707 "nvme_iov_md": false 00:05:18.707 }, 00:05:18.707 "memory_domains": [ 00:05:18.707 { 
00:05:18.707 "dma_device_id": "system", 00:05:18.707 "dma_device_type": 1 00:05:18.707 }, 00:05:18.707 { 00:05:18.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.707 "dma_device_type": 2 00:05:18.707 } 00:05:18.707 ], 00:05:18.707 "driver_specific": { 00:05:18.707 "passthru": { 00:05:18.707 "name": "Passthru0", 00:05:18.707 "base_bdev_name": "Malloc0" 00:05:18.707 } 00:05:18.707 } 00:05:18.707 } 00:05:18.707 ]' 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:18.707 18:42:03 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:18.707 00:05:18.707 real 0m0.263s 00:05:18.707 user 0m0.160s 00:05:18.707 sys 0m0.039s 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 ************************************ 00:05:18.707 END TEST rpc_integrity 00:05:18.707 ************************************ 00:05:18.707 18:42:03 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:18.707 18:42:03 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:18.707 18:42:03 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:18.707 18:42:03 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 ************************************ 00:05:18.707 START TEST rpc_plugins 00:05:18.707 ************************************ 00:05:18.707 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:18.707 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:18.707 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.707 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:18.707 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.708 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:18.708 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:18.708 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.708 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:18.708 18:42:03 
rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.708 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:18.708 { 00:05:18.708 "name": "Malloc1", 00:05:18.708 "aliases": [ 00:05:18.708 "6abdd883-d98e-4222-96d4-3ba966592ce4" 00:05:18.708 ], 00:05:18.708 "product_name": "Malloc disk", 00:05:18.708 "block_size": 4096, 00:05:18.708 "num_blocks": 256, 00:05:18.708 "uuid": "6abdd883-d98e-4222-96d4-3ba966592ce4", 00:05:18.708 "assigned_rate_limits": { 00:05:18.708 "rw_ios_per_sec": 0, 00:05:18.708 "rw_mbytes_per_sec": 0, 00:05:18.708 "r_mbytes_per_sec": 0, 00:05:18.708 "w_mbytes_per_sec": 0 00:05:18.708 }, 00:05:18.708 "claimed": false, 00:05:18.708 "zoned": false, 00:05:18.708 "supported_io_types": { 00:05:18.708 "read": true, 00:05:18.708 "write": true, 00:05:18.708 "unmap": true, 00:05:18.708 "flush": true, 00:05:18.708 "reset": true, 00:05:18.708 "nvme_admin": false, 00:05:18.708 "nvme_io": false, 00:05:18.708 "nvme_io_md": false, 00:05:18.708 "write_zeroes": true, 00:05:18.708 "zcopy": true, 00:05:18.708 "get_zone_info": false, 00:05:18.708 "zone_management": false, 00:05:18.708 "zone_append": false, 00:05:18.708 "compare": false, 00:05:18.708 "compare_and_write": false, 00:05:18.708 "abort": true, 00:05:18.708 "seek_hole": false, 00:05:18.708 "seek_data": false, 00:05:18.708 "copy": true, 00:05:18.708 "nvme_iov_md": false 00:05:18.708 }, 00:05:18.708 "memory_domains": [ 00:05:18.708 { 00:05:18.708 "dma_device_id": "system", 00:05:18.708 "dma_device_type": 1 00:05:18.708 }, 00:05:18.708 { 00:05:18.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.708 "dma_device_type": 2 00:05:18.708 } 00:05:18.708 ], 00:05:18.708 "driver_specific": {} 00:05:18.708 } 00:05:18.708 ]' 00:05:18.708 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:18.966 18:42:03 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:18.966 00:05:18.966 real 0m0.129s 00:05:18.966 user 0m0.078s 00:05:18.966 sys 0m0.015s 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:18.966 18:42:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:18.966 ************************************ 00:05:18.966 END TEST rpc_plugins 00:05:18.966 ************************************ 00:05:18.966 18:42:03 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:18.966 18:42:03 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:18.966 18:42:03 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:18.966 18:42:03 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.966 
************************************ 00:05:18.966 START TEST rpc_trace_cmd_test 00:05:18.966 ************************************ 00:05:18.966 18:42:03 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:18.966 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:18.966 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:18.966 18:42:03 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:18.967 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2002571", 00:05:18.967 "tpoint_group_mask": "0x8", 00:05:18.967 "iscsi_conn": { 00:05:18.967 "mask": "0x2", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "scsi": { 00:05:18.967 "mask": "0x4", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "bdev": { 00:05:18.967 "mask": "0x8", 00:05:18.967 "tpoint_mask": "0xffffffffffffffff" 00:05:18.967 }, 00:05:18.967 "nvmf_rdma": { 00:05:18.967 "mask": "0x10", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "nvmf_tcp": { 00:05:18.967 "mask": "0x20", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "ftl": { 00:05:18.967 "mask": "0x40", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "blobfs": { 00:05:18.967 "mask": "0x80", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "dsa": { 00:05:18.967 "mask": "0x200", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "thread": { 00:05:18.967 "mask": "0x400", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "nvme_pcie": { 00:05:18.967 "mask": "0x800", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "iaa": { 00:05:18.967 "mask": "0x1000", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "nvme_tcp": { 00:05:18.967 "mask": "0x2000", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "bdev_nvme": { 00:05:18.967 "mask": "0x4000", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 }, 00:05:18.967 "sock": { 00:05:18.967 "mask": "0x8000", 00:05:18.967 "tpoint_mask": "0x0" 00:05:18.967 } 00:05:18.967 }' 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:18.967 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:19.226 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:19.226 18:42:03 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:19.226 18:42:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:19.226 18:42:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:19.226 18:42:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:19.226 00:05:19.226 real 0m0.193s 00:05:19.226 user 0m0.161s 00:05:19.226 sys 0m0.024s 00:05:19.226 18:42:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:19.226 18:42:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:19.226 
************************************ 00:05:19.226 END TEST rpc_trace_cmd_test 00:05:19.226 ************************************ 00:05:19.226 18:42:04 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:19.226 18:42:04 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:19.226 18:42:04 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:19.226 18:42:04 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:19.226 18:42:04 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:19.226 18:42:04 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.226 ************************************ 00:05:19.226 START TEST rpc_daemon_integrity 00:05:19.226 ************************************ 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:19.226 { 00:05:19.226 "name": "Malloc2", 00:05:19.226 "aliases": [ 00:05:19.226 "98b4d05b-07fb-4a02-a825-f5bf8b453eec" 00:05:19.226 ], 00:05:19.226 "product_name": "Malloc disk", 00:05:19.226 "block_size": 512, 00:05:19.226 "num_blocks": 16384, 00:05:19.226 "uuid": "98b4d05b-07fb-4a02-a825-f5bf8b453eec", 00:05:19.226 "assigned_rate_limits": { 00:05:19.226 "rw_ios_per_sec": 0, 00:05:19.226 "rw_mbytes_per_sec": 0, 00:05:19.226 "r_mbytes_per_sec": 0, 00:05:19.226 "w_mbytes_per_sec": 0 00:05:19.226 }, 00:05:19.226 "claimed": false, 00:05:19.226 "zoned": false, 00:05:19.226 "supported_io_types": { 00:05:19.226 "read": true, 00:05:19.226 "write": true, 00:05:19.226 "unmap": true, 00:05:19.226 "flush": true, 00:05:19.226 "reset": true, 00:05:19.226 "nvme_admin": false, 00:05:19.226 "nvme_io": false, 00:05:19.226 "nvme_io_md": false, 00:05:19.226 "write_zeroes": true, 00:05:19.226 "zcopy": true, 00:05:19.226 "get_zone_info": false, 00:05:19.226 "zone_management": false, 00:05:19.226 "zone_append": false, 00:05:19.226 "compare": false, 00:05:19.226 "compare_and_write": false, 00:05:19.226 "abort": true, 00:05:19.226 "seek_hole": false, 00:05:19.226 "seek_data": 
false, 00:05:19.226 "copy": true, 00:05:19.226 "nvme_iov_md": false 00:05:19.226 }, 00:05:19.226 "memory_domains": [ 00:05:19.226 { 00:05:19.226 "dma_device_id": "system", 00:05:19.226 "dma_device_type": 1 00:05:19.226 }, 00:05:19.226 { 00:05:19.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.226 "dma_device_type": 2 00:05:19.226 } 00:05:19.226 ], 00:05:19.226 "driver_specific": {} 00:05:19.226 } 00:05:19.226 ]' 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.226 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.486 [2024-07-24 18:42:04.236928] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:19.486 [2024-07-24 18:42:04.236953] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:19.486 [2024-07-24 18:42:04.236966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b83dc0 00:05:19.486 [2024-07-24 18:42:04.236972] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:19.486 [2024-07-24 18:42:04.237912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:19.486 [2024-07-24 18:42:04.237931] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:19.486 Passthru0 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.486 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:19.486 { 00:05:19.486 "name": "Malloc2", 00:05:19.486 "aliases": [ 00:05:19.486 "98b4d05b-07fb-4a02-a825-f5bf8b453eec" 00:05:19.486 ], 00:05:19.486 "product_name": "Malloc disk", 00:05:19.486 "block_size": 512, 00:05:19.486 "num_blocks": 16384, 00:05:19.486 "uuid": "98b4d05b-07fb-4a02-a825-f5bf8b453eec", 00:05:19.486 "assigned_rate_limits": { 00:05:19.486 "rw_ios_per_sec": 0, 00:05:19.486 "rw_mbytes_per_sec": 0, 00:05:19.486 "r_mbytes_per_sec": 0, 00:05:19.486 "w_mbytes_per_sec": 0 00:05:19.486 }, 00:05:19.486 "claimed": true, 00:05:19.486 "claim_type": "exclusive_write", 00:05:19.486 "zoned": false, 00:05:19.486 "supported_io_types": { 00:05:19.486 "read": true, 00:05:19.486 "write": true, 00:05:19.486 "unmap": true, 00:05:19.486 "flush": true, 00:05:19.486 "reset": true, 00:05:19.486 "nvme_admin": false, 00:05:19.486 "nvme_io": false, 00:05:19.486 "nvme_io_md": false, 00:05:19.486 "write_zeroes": true, 00:05:19.486 "zcopy": true, 00:05:19.486 "get_zone_info": false, 00:05:19.486 "zone_management": false, 00:05:19.486 "zone_append": false, 00:05:19.486 "compare": false, 00:05:19.486 "compare_and_write": false, 00:05:19.486 "abort": true, 00:05:19.486 "seek_hole": false, 00:05:19.486 "seek_data": false, 00:05:19.486 "copy": true, 00:05:19.486 "nvme_iov_md": false 00:05:19.486 }, 00:05:19.486 "memory_domains": [ 
00:05:19.486 { 00:05:19.486 "dma_device_id": "system", 00:05:19.486 "dma_device_type": 1 00:05:19.486 }, 00:05:19.486 { 00:05:19.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.487 "dma_device_type": 2 00:05:19.487 } 00:05:19.487 ], 00:05:19.487 "driver_specific": {} 00:05:19.487 }, 00:05:19.487 { 00:05:19.487 "name": "Passthru0", 00:05:19.487 "aliases": [ 00:05:19.487 "ba53f416-7d3c-5e4e-ab3c-cba2ee30fdf8" 00:05:19.487 ], 00:05:19.487 "product_name": "passthru", 00:05:19.487 "block_size": 512, 00:05:19.487 "num_blocks": 16384, 00:05:19.487 "uuid": "ba53f416-7d3c-5e4e-ab3c-cba2ee30fdf8", 00:05:19.487 "assigned_rate_limits": { 00:05:19.487 "rw_ios_per_sec": 0, 00:05:19.487 "rw_mbytes_per_sec": 0, 00:05:19.487 "r_mbytes_per_sec": 0, 00:05:19.487 "w_mbytes_per_sec": 0 00:05:19.487 }, 00:05:19.487 "claimed": false, 00:05:19.487 "zoned": false, 00:05:19.487 "supported_io_types": { 00:05:19.487 "read": true, 00:05:19.487 "write": true, 00:05:19.487 "unmap": true, 00:05:19.487 "flush": true, 00:05:19.487 "reset": true, 00:05:19.487 "nvme_admin": false, 00:05:19.487 "nvme_io": false, 00:05:19.487 "nvme_io_md": false, 00:05:19.487 "write_zeroes": true, 00:05:19.487 "zcopy": true, 00:05:19.487 "get_zone_info": false, 00:05:19.487 "zone_management": false, 00:05:19.487 "zone_append": false, 00:05:19.487 "compare": false, 00:05:19.487 "compare_and_write": false, 00:05:19.487 "abort": true, 00:05:19.487 "seek_hole": false, 00:05:19.487 "seek_data": false, 00:05:19.487 "copy": true, 00:05:19.487 "nvme_iov_md": false 00:05:19.487 }, 00:05:19.487 "memory_domains": [ 00:05:19.487 { 00:05:19.487 "dma_device_id": "system", 00:05:19.487 "dma_device_type": 1 00:05:19.487 }, 00:05:19.487 { 00:05:19.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.487 "dma_device_type": 2 00:05:19.487 } 00:05:19.487 ], 00:05:19.487 "driver_specific": { 00:05:19.487 "passthru": { 00:05:19.487 "name": "Passthru0", 00:05:19.487 "base_bdev_name": "Malloc2" 00:05:19.487 } 00:05:19.487 } 00:05:19.487 } 00:05:19.487 ]' 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:19.487 18:42:04 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:19.487 00:05:19.487 real 0m0.257s 00:05:19.487 user 0m0.162s 00:05:19.487 sys 0m0.036s 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:19.487 18:42:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.487 ************************************ 00:05:19.487 END TEST rpc_daemon_integrity 00:05:19.487 ************************************ 00:05:19.487 18:42:04 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:19.487 18:42:04 rpc -- rpc/rpc.sh@84 -- # killprocess 2002571 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@948 -- # '[' -z 2002571 ']' 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@952 -- # kill -0 2002571 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@953 -- # uname 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2002571 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2002571' 00:05:19.487 killing process with pid 2002571 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@967 -- # kill 2002571 00:05:19.487 18:42:04 rpc -- common/autotest_common.sh@972 -- # wait 2002571 00:05:19.746 00:05:19.746 real 0m2.383s 00:05:19.746 user 0m3.020s 00:05:19.746 sys 0m0.664s 00:05:19.746 18:42:04 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:19.746 18:42:04 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.747 ************************************ 00:05:19.747 END TEST rpc 00:05:19.747 ************************************ 00:05:20.005 18:42:04 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:20.005 18:42:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:20.005 18:42:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.005 18:42:04 -- common/autotest_common.sh@10 -- # set +x 00:05:20.005 ************************************ 00:05:20.005 START TEST skip_rpc 00:05:20.005 ************************************ 00:05:20.005 18:42:04 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:20.005 * Looking for test storage... 
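The rpc_daemon_integrity pass above drives the passthru bdev entirely over JSON-RPC: stack Passthru0 on the Malloc2 base, dump both with bdev_get_bdevs, then delete them and confirm the list is empty. A rough by-hand equivalent against a running target, noting that the malloc size and block size are only inferred from the bdev dump above:

  scripts/rpc.py bdev_malloc_create 8 512 --name Malloc2
  scripts/rpc.py bdev_passthru_create -b Malloc2 -p Passthru0
  scripts/rpc.py bdev_get_bdevs | jq length    # expect 2 (Malloc2 + Passthru0)
  scripts/rpc.py bdev_passthru_delete Passthru0
  scripts/rpc.py bdev_malloc_delete Malloc2
  scripts/rpc.py bdev_get_bdevs | jq length    # expect 0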
00:05:20.005 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:20.005 18:42:04 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:20.005 18:42:04 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:20.005 18:42:04 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:20.005 18:42:04 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:20.005 18:42:04 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.005 18:42:04 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:20.005 ************************************ 00:05:20.005 START TEST skip_rpc 00:05:20.005 ************************************ 00:05:20.005 18:42:04 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:20.005 18:42:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:20.005 18:42:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2003192 00:05:20.005 18:42:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:20.005 18:42:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:20.005 [2024-07-24 18:42:04.976630] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:20.006 [2024-07-24 18:42:04.976667] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2003192 ] 00:05:20.264 [2024-07-24 18:42:05.040481] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:20.264 [2024-07-24 18:42:05.113123] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:25.537 18:42:09 
skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2003192 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2003192 ']' 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2003192 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2003192 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2003192' 00:05:25.537 killing process with pid 2003192 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2003192 00:05:25.537 18:42:09 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2003192 00:05:25.537 00:05:25.537 real 0m5.364s 00:05:25.537 user 0m5.115s 00:05:25.537 sys 0m0.259s 00:05:25.537 18:42:10 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.537 18:42:10 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:25.537 ************************************ 00:05:25.537 END TEST skip_rpc 00:05:25.537 ************************************ 00:05:25.537 18:42:10 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:25.537 18:42:10 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.537 18:42:10 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.537 18:42:10 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:25.537 ************************************ 00:05:25.537 START TEST skip_rpc_with_json 00:05:25.537 ************************************ 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2004518 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2004518 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2004518 ']' 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:25.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
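The skip_rpc pass above reduces to starting the target with --no-rpc-server and asserting that any RPC call then fails; skip_rpc_with_json, which is starting here, goes the other way and replays a saved JSON configuration. A minimal sketch of the first check, assuming the repository root as the working directory:

  build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  sleep 5
  scripts/rpc.py spdk_get_version && echo 'unexpected: RPC succeeded' || echo 'RPC refused as expected'
  kill %1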
00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:25.537 18:42:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:25.537 [2024-07-24 18:42:10.408489] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:25.537 [2024-07-24 18:42:10.408527] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2004518 ] 00:05:25.537 [2024-07-24 18:42:10.472133] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.796 [2024-07-24 18:42:10.549862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:26.364 [2024-07-24 18:42:11.203647] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:26.364 request: 00:05:26.364 { 00:05:26.364 "trtype": "tcp", 00:05:26.364 "method": "nvmf_get_transports", 00:05:26.364 "req_id": 1 00:05:26.364 } 00:05:26.364 Got JSON-RPC error response 00:05:26.364 response: 00:05:26.364 { 00:05:26.364 "code": -19, 00:05:26.364 "message": "No such device" 00:05:26.364 } 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:26.364 [2024-07-24 18:42:11.211751] tcp.c: 729:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:26.364 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:26.364 { 00:05:26.364 "subsystems": [ 00:05:26.364 { 00:05:26.364 "subsystem": "keyring", 00:05:26.364 "config": [] 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "subsystem": "iobuf", 00:05:26.364 "config": [ 00:05:26.364 { 00:05:26.364 "method": "iobuf_set_options", 00:05:26.364 "params": { 00:05:26.364 "small_pool_count": 8192, 00:05:26.364 "large_pool_count": 1024, 00:05:26.364 "small_bufsize": 8192, 00:05:26.364 "large_bufsize": 135168 00:05:26.364 } 00:05:26.364 } 00:05:26.364 ] 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "subsystem": "sock", 00:05:26.364 "config": [ 00:05:26.364 { 00:05:26.364 
"method": "sock_set_default_impl", 00:05:26.364 "params": { 00:05:26.364 "impl_name": "posix" 00:05:26.364 } 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "method": "sock_impl_set_options", 00:05:26.364 "params": { 00:05:26.364 "impl_name": "ssl", 00:05:26.364 "recv_buf_size": 4096, 00:05:26.364 "send_buf_size": 4096, 00:05:26.364 "enable_recv_pipe": true, 00:05:26.364 "enable_quickack": false, 00:05:26.364 "enable_placement_id": 0, 00:05:26.364 "enable_zerocopy_send_server": true, 00:05:26.364 "enable_zerocopy_send_client": false, 00:05:26.364 "zerocopy_threshold": 0, 00:05:26.364 "tls_version": 0, 00:05:26.364 "enable_ktls": false 00:05:26.364 } 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "method": "sock_impl_set_options", 00:05:26.364 "params": { 00:05:26.364 "impl_name": "posix", 00:05:26.364 "recv_buf_size": 2097152, 00:05:26.364 "send_buf_size": 2097152, 00:05:26.364 "enable_recv_pipe": true, 00:05:26.364 "enable_quickack": false, 00:05:26.364 "enable_placement_id": 0, 00:05:26.364 "enable_zerocopy_send_server": true, 00:05:26.364 "enable_zerocopy_send_client": false, 00:05:26.364 "zerocopy_threshold": 0, 00:05:26.364 "tls_version": 0, 00:05:26.364 "enable_ktls": false 00:05:26.364 } 00:05:26.364 } 00:05:26.364 ] 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "subsystem": "vmd", 00:05:26.364 "config": [] 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "subsystem": "accel", 00:05:26.364 "config": [ 00:05:26.364 { 00:05:26.364 "method": "accel_set_options", 00:05:26.364 "params": { 00:05:26.364 "small_cache_size": 128, 00:05:26.364 "large_cache_size": 16, 00:05:26.364 "task_count": 2048, 00:05:26.364 "sequence_count": 2048, 00:05:26.364 "buf_count": 2048 00:05:26.364 } 00:05:26.364 } 00:05:26.364 ] 00:05:26.364 }, 00:05:26.364 { 00:05:26.364 "subsystem": "bdev", 00:05:26.364 "config": [ 00:05:26.364 { 00:05:26.365 "method": "bdev_set_options", 00:05:26.365 "params": { 00:05:26.365 "bdev_io_pool_size": 65535, 00:05:26.365 "bdev_io_cache_size": 256, 00:05:26.365 "bdev_auto_examine": true, 00:05:26.365 "iobuf_small_cache_size": 128, 00:05:26.365 "iobuf_large_cache_size": 16 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "bdev_raid_set_options", 00:05:26.365 "params": { 00:05:26.365 "process_window_size_kb": 1024, 00:05:26.365 "process_max_bandwidth_mb_sec": 0 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "bdev_iscsi_set_options", 00:05:26.365 "params": { 00:05:26.365 "timeout_sec": 30 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "bdev_nvme_set_options", 00:05:26.365 "params": { 00:05:26.365 "action_on_timeout": "none", 00:05:26.365 "timeout_us": 0, 00:05:26.365 "timeout_admin_us": 0, 00:05:26.365 "keep_alive_timeout_ms": 10000, 00:05:26.365 "arbitration_burst": 0, 00:05:26.365 "low_priority_weight": 0, 00:05:26.365 "medium_priority_weight": 0, 00:05:26.365 "high_priority_weight": 0, 00:05:26.365 "nvme_adminq_poll_period_us": 10000, 00:05:26.365 "nvme_ioq_poll_period_us": 0, 00:05:26.365 "io_queue_requests": 0, 00:05:26.365 "delay_cmd_submit": true, 00:05:26.365 "transport_retry_count": 4, 00:05:26.365 "bdev_retry_count": 3, 00:05:26.365 "transport_ack_timeout": 0, 00:05:26.365 "ctrlr_loss_timeout_sec": 0, 00:05:26.365 "reconnect_delay_sec": 0, 00:05:26.365 "fast_io_fail_timeout_sec": 0, 00:05:26.365 "disable_auto_failback": false, 00:05:26.365 "generate_uuids": false, 00:05:26.365 "transport_tos": 0, 00:05:26.365 "nvme_error_stat": false, 00:05:26.365 "rdma_srq_size": 0, 00:05:26.365 "io_path_stat": false, 00:05:26.365 
"allow_accel_sequence": false, 00:05:26.365 "rdma_max_cq_size": 0, 00:05:26.365 "rdma_cm_event_timeout_ms": 0, 00:05:26.365 "dhchap_digests": [ 00:05:26.365 "sha256", 00:05:26.365 "sha384", 00:05:26.365 "sha512" 00:05:26.365 ], 00:05:26.365 "dhchap_dhgroups": [ 00:05:26.365 "null", 00:05:26.365 "ffdhe2048", 00:05:26.365 "ffdhe3072", 00:05:26.365 "ffdhe4096", 00:05:26.365 "ffdhe6144", 00:05:26.365 "ffdhe8192" 00:05:26.365 ] 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "bdev_nvme_set_hotplug", 00:05:26.365 "params": { 00:05:26.365 "period_us": 100000, 00:05:26.365 "enable": false 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "bdev_wait_for_examine" 00:05:26.365 } 00:05:26.365 ] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "scsi", 00:05:26.365 "config": null 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "scheduler", 00:05:26.365 "config": [ 00:05:26.365 { 00:05:26.365 "method": "framework_set_scheduler", 00:05:26.365 "params": { 00:05:26.365 "name": "static" 00:05:26.365 } 00:05:26.365 } 00:05:26.365 ] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "vhost_scsi", 00:05:26.365 "config": [] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "vhost_blk", 00:05:26.365 "config": [] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "ublk", 00:05:26.365 "config": [] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "nbd", 00:05:26.365 "config": [] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "nvmf", 00:05:26.365 "config": [ 00:05:26.365 { 00:05:26.365 "method": "nvmf_set_config", 00:05:26.365 "params": { 00:05:26.365 "discovery_filter": "match_any", 00:05:26.365 "admin_cmd_passthru": { 00:05:26.365 "identify_ctrlr": false 00:05:26.365 } 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "nvmf_set_max_subsystems", 00:05:26.365 "params": { 00:05:26.365 "max_subsystems": 1024 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "nvmf_set_crdt", 00:05:26.365 "params": { 00:05:26.365 "crdt1": 0, 00:05:26.365 "crdt2": 0, 00:05:26.365 "crdt3": 0 00:05:26.365 } 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "method": "nvmf_create_transport", 00:05:26.365 "params": { 00:05:26.365 "trtype": "TCP", 00:05:26.365 "max_queue_depth": 128, 00:05:26.365 "max_io_qpairs_per_ctrlr": 127, 00:05:26.365 "in_capsule_data_size": 4096, 00:05:26.365 "max_io_size": 131072, 00:05:26.365 "io_unit_size": 131072, 00:05:26.365 "max_aq_depth": 128, 00:05:26.365 "num_shared_buffers": 511, 00:05:26.365 "buf_cache_size": 4294967295, 00:05:26.365 "dif_insert_or_strip": false, 00:05:26.365 "zcopy": false, 00:05:26.365 "c2h_success": true, 00:05:26.365 "sock_priority": 0, 00:05:26.365 "abort_timeout_sec": 1, 00:05:26.365 "ack_timeout": 0, 00:05:26.365 "data_wr_pool_size": 0 00:05:26.365 } 00:05:26.365 } 00:05:26.365 ] 00:05:26.365 }, 00:05:26.365 { 00:05:26.365 "subsystem": "iscsi", 00:05:26.365 "config": [ 00:05:26.365 { 00:05:26.365 "method": "iscsi_set_options", 00:05:26.365 "params": { 00:05:26.365 "node_base": "iqn.2016-06.io.spdk", 00:05:26.365 "max_sessions": 128, 00:05:26.365 "max_connections_per_session": 2, 00:05:26.365 "max_queue_depth": 64, 00:05:26.365 "default_time2wait": 2, 00:05:26.365 "default_time2retain": 20, 00:05:26.365 "first_burst_length": 8192, 00:05:26.365 "immediate_data": true, 00:05:26.365 "allow_duplicated_isid": false, 00:05:26.365 "error_recovery_level": 0, 00:05:26.365 "nop_timeout": 60, 00:05:26.365 "nop_in_interval": 30, 00:05:26.365 "disable_chap": false, 
00:05:26.365 "require_chap": false, 00:05:26.365 "mutual_chap": false, 00:05:26.365 "chap_group": 0, 00:05:26.365 "max_large_datain_per_connection": 64, 00:05:26.365 "max_r2t_per_connection": 4, 00:05:26.365 "pdu_pool_size": 36864, 00:05:26.365 "immediate_data_pool_size": 16384, 00:05:26.365 "data_out_pool_size": 2048 00:05:26.365 } 00:05:26.365 } 00:05:26.365 ] 00:05:26.365 } 00:05:26.365 ] 00:05:26.365 } 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2004518 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2004518 ']' 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2004518 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:26.365 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2004518 00:05:26.624 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:26.624 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:26.624 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2004518' 00:05:26.624 killing process with pid 2004518 00:05:26.624 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2004518 00:05:26.624 18:42:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2004518 00:05:26.884 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2004754 00:05:26.884 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:26.884 18:42:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2004754 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2004754 ']' 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2004754 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2004754 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2004754' 00:05:32.153 killing process with pid 2004754 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2004754 00:05:32.153 18:42:16 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2004754 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:32.153 00:05:32.153 real 0m6.693s 00:05:32.153 user 0m6.465s 00:05:32.153 sys 0m0.591s 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:32.153 ************************************ 00:05:32.153 END TEST skip_rpc_with_json 00:05:32.153 ************************************ 00:05:32.153 18:42:17 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:32.153 18:42:17 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:32.153 18:42:17 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:32.153 18:42:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:32.153 ************************************ 00:05:32.153 START TEST skip_rpc_with_delay 00:05:32.153 ************************************ 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:32.153 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:32.413 [2024-07-24 18:42:17.163032] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
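The skip_rpc_with_json run above saves the live configuration, including the freshly created TCP transport, and restarts the target from that file; skip_rpc_with_delay, whose error message appears just above, only asserts that --wait-for-rpc is rejected when the RPC server is disabled. A sketch of the save/replay round trip, assuming the default /var/tmp/spdk.sock socket, the test's config/log paths, and that the target's stdout is what lands in log.txt:

  scripts/rpc.py nvmf_create_transport -t tcp
  scripts/rpc.py save_config > test/rpc/config.json
  build/bin/spdk_tgt --no-rpc-server -m 0x1 --json test/rpc/config.json > test/rpc/log.txt 2>&1 &
  sleep 5
  grep -q 'TCP Transport Init' test/rpc/log.txt    # the replayed config recreates the transport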
00:05:32.413 [2024-07-24 18:42:17.163090] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:32.413 00:05:32.413 real 0m0.073s 00:05:32.413 user 0m0.040s 00:05:32.413 sys 0m0.032s 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.413 18:42:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:32.413 ************************************ 00:05:32.413 END TEST skip_rpc_with_delay 00:05:32.413 ************************************ 00:05:32.413 18:42:17 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:32.413 18:42:17 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:32.413 18:42:17 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:32.413 18:42:17 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:32.413 18:42:17 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:32.413 18:42:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:32.413 ************************************ 00:05:32.413 START TEST exit_on_failed_rpc_init 00:05:32.413 ************************************ 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2005722 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 2005722 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2005722 ']' 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:32.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:32.413 18:42:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:32.413 [2024-07-24 18:42:17.296139] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:05:32.413 [2024-07-24 18:42:17.296179] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2005722 ] 00:05:32.413 [2024-07-24 18:42:17.359876] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.672 [2024-07-24 18:42:17.437798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:33.240 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:33.240 [2024-07-24 18:42:18.126674] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:33.240 [2024-07-24 18:42:18.126717] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2005840 ] 00:05:33.240 [2024-07-24 18:42:18.189252] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.500 [2024-07-24 18:42:18.265968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:33.500 [2024-07-24 18:42:18.266027] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
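The "RPC Unix domain socket path /var/tmp/spdk.sock in use" error above is the whole point of exit_on_failed_rpc_init: a second target aimed at the same default socket must fail RPC initialization and exit non-zero instead of hanging. Roughly, with a plain sleep standing in for the harness's waitforlisten helper:

  build/bin/spdk_tgt -m 0x1 &      # first target claims /var/tmp/spdk.sock
  sleep 5
  build/bin/spdk_tgt -m 0x2 && echo 'unexpected: second target started' || echo 'second target failed as expected'
  kill %1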
00:05:33.500 [2024-07-24 18:42:18.266036] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:33.500 [2024-07-24 18:42:18.266042] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2005722 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2005722 ']' 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2005722 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2005722 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2005722' 00:05:33.500 killing process with pid 2005722 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2005722 00:05:33.500 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2005722 00:05:33.778 00:05:33.778 real 0m1.441s 00:05:33.778 user 0m1.644s 00:05:33.778 sys 0m0.392s 00:05:33.778 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.778 18:42:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:33.778 ************************************ 00:05:33.778 END TEST exit_on_failed_rpc_init 00:05:33.778 ************************************ 00:05:33.778 18:42:18 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:33.778 00:05:33.778 real 0m13.900s 00:05:33.778 user 0m13.405s 00:05:33.778 sys 0m1.483s 00:05:33.778 18:42:18 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.778 18:42:18 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:33.778 ************************************ 00:05:33.778 END TEST skip_rpc 00:05:33.778 ************************************ 00:05:33.778 18:42:18 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:33.778 18:42:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:33.778 18:42:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.778 18:42:18 -- 
common/autotest_common.sh@10 -- # set +x 00:05:34.076 ************************************ 00:05:34.076 START TEST rpc_client 00:05:34.076 ************************************ 00:05:34.076 18:42:18 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:34.076 * Looking for test storage... 00:05:34.076 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:34.076 18:42:18 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:34.076 OK 00:05:34.076 18:42:18 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:34.076 00:05:34.076 real 0m0.115s 00:05:34.076 user 0m0.054s 00:05:34.076 sys 0m0.069s 00:05:34.076 18:42:18 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:34.076 18:42:18 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:34.076 ************************************ 00:05:34.076 END TEST rpc_client 00:05:34.076 ************************************ 00:05:34.076 18:42:18 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:34.076 18:42:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.076 18:42:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.076 18:42:18 -- common/autotest_common.sh@10 -- # set +x 00:05:34.076 ************************************ 00:05:34.076 START TEST json_config 00:05:34.076 ************************************ 00:05:34.076 18:42:18 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:34.076 18:42:19 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:34.076 18:42:19 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:34.076 18:42:19 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:34.076 18:42:19 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:34.076 18:42:19 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:34.077 18:42:19 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:34.077 18:42:19 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:34.077 18:42:19 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:34.077 18:42:19 json_config -- paths/export.sh@5 -- # export PATH 00:05:34.077 18:42:19 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@47 -- # : 0 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:34.077 18:42:19 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:05:34.077 INFO: JSON configuration test init 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.077 18:42:19 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:05:34.077 18:42:19 json_config -- json_config/common.sh@9 -- # local app=target 00:05:34.077 18:42:19 json_config -- json_config/common.sh@10 -- # shift 00:05:34.077 18:42:19 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:34.077 18:42:19 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:34.077 18:42:19 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:34.077 18:42:19 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:34.077 18:42:19 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:34.077 18:42:19 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2006064 00:05:34.077 18:42:19 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:34.077 Waiting for target to run... 
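json_config starts its own target on a dedicated socket and holds it in --wait-for-rpc mode so the accel and crypto options can be applied before subsystem initialization; every RPC in this test therefore carries -s /var/tmp/spdk_tgt.sock. A sketch of that startup, with the flags copied from the trace just below:

  build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc &
  # then drive it with: scripts/rpc.py -s /var/tmp/spdk_tgt.sock <method> [args...]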
00:05:34.077 18:42:19 json_config -- json_config/common.sh@25 -- # waitforlisten 2006064 /var/tmp/spdk_tgt.sock 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@829 -- # '[' -z 2006064 ']' 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:34.077 18:42:19 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:34.077 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:34.077 18:42:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.336 [2024-07-24 18:42:19.123884] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:34.336 [2024-07-24 18:42:19.123925] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2006064 ] 00:05:34.595 [2024-07-24 18:42:19.568643] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.854 [2024-07-24 18:42:19.652617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.113 18:42:19 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:35.113 18:42:19 json_config -- common/autotest_common.sh@862 -- # return 0 00:05:35.113 18:42:19 json_config -- json_config/common.sh@26 -- # echo '' 00:05:35.113 00:05:35.113 18:42:19 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:05:35.113 18:42:19 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:05:35.113 18:42:19 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:35.113 18:42:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.113 18:42:19 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:05:35.113 18:42:19 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:35.113 18:42:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:35.113 18:42:20 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:35.113 18:42:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:35.372 [2024-07-24 18:42:20.230351] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:35.372 18:42:20 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:35.372 18:42:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:35.631 [2024-07-24 18:42:20.398764] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:35.631 18:42:20 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:05:35.631 18:42:20 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:35.631 18:42:20 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.631 18:42:20 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:35.631 18:42:20 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:05:35.631 18:42:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:35.631 [2024-07-24 18:42:20.630513] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:40.902 18:42:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@51 -- # sort 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@59 -- # return 0 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 
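The create_accel_config step above routes the encrypt and decrypt operations to the dpdk_cryptodev accel module while the target is still waiting for RPCs, which is consistent with the later load_config step reporting the crypto devices it found. The three RPCs involved, as issued against the test socket:

  scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
  scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
  scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev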
00:05:40.902 18:42:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:05:40.902 18:42:25 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:40.902 18:42:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:41.160 18:42:25 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:05:41.160 18:42:25 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:41.160 18:42:25 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:41.160 18:42:25 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:05:41.160 18:42:25 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:05:41.161 18:42:25 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:41.161 18:42:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:41.161 Nvme0n1p0 Nvme0n1p1 00:05:41.419 18:42:26 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:41.419 18:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:41.419 [2024-07-24 18:42:26.344013] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:41.419 [2024-07-24 18:42:26.344051] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:41.419 00:05:41.419 18:42:26 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:41.419 18:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:41.677 Malloc3 00:05:41.677 18:42:26 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:41.677 18:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:41.677 [2024-07-24 18:42:26.684926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:41.677 [2024-07-24 18:42:26.684952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:41.677 [2024-07-24 18:42:26.684963] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x220f3a0 00:05:41.677 [2024-07-24 18:42:26.684969] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:41.677 [2024-07-24 18:42:26.685948] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:41.677 [2024-07-24 18:42:26.685968] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:41.936 PTBdevFromMalloc3 00:05:41.936 18:42:26 json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:41.936 18:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:41.936 Null0 00:05:41.936 18:42:26 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:41.936 18:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:42.194 Malloc0 00:05:42.194 18:42:27 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:42.194 18:42:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:42.194 Malloc1 00:05:42.453 18:42:27 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:42.453 18:42:27 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:42.453 102400+0 records in 00:05:42.453 102400+0 records out 00:05:42.453 104857600 bytes (105 MB, 100 MiB) copied, 0.111989 s, 936 MB/s 00:05:42.453 18:42:27 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:42.453 18:42:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:42.711 aio_disk 00:05:42.711 18:42:27 json_config -- json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:42.711 18:42:27 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:42.711 18:42:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:44.611 b86226c3-f296-43a6-b0ce-7bf979272861 00:05:44.869 18:42:29 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:44.869 18:42:29 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:44.869 18:42:29 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:45.126 18:42:29 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:45.126 18:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:45.126 18:42:30 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:45.126 18:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:45.384 18:42:30 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:45.384 18:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:45.642 18:42:30 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:05:45.642 18:42:30 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:45.642 18:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:45.643 MallocForCryptoBdev 00:05:45.643 18:42:30 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:05:45.643 18:42:30 json_config -- json_config/json_config.sh@163 -- # wc -l 00:05:45.643 18:42:30 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:05:45.643 18:42:30 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:05:45.643 18:42:30 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:45.643 18:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:45.901 [2024-07-24 18:42:30.735300] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:45.901 CryptoMallocBdev 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:05:45.901 
18:42:30 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@75 -- # sort 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@76 -- # sort 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:45.901 18:42:30 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:05:45.901 18:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Null0 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 
-- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 
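Each bdev_register:* line echoed above is produced by get_notifications, which asks the target for every notification since id 0, flattens each one to a type:ctx:id string with jq, and splits that string on ':' in the shell. A hedged sketch of the same read loop, reusing the rpc.py path and socket shown in this log:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"

# Fetch all notifications starting at id 0 and flatten each to "type:ctx:id".
$rpc notify_get_notifications -i 0 \
  | jq -r '.[] | "\(.type):\(.ctx):\(.id)"' \
  | while IFS=: read -r ev_type ev_ctx event_id; do
      # ev_type is e.g. bdev_register, ev_ctx the bdev name, event_id the notification's sequence number.
      echo "${ev_type}:${ev_ctx}"
    done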
00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\5\7\f\c\9\e\c\9\-\c\1\d\0\-\4\8\9\a\-\a\0\6\d\-\b\a\a\9\b\d\0\4\b\4\8\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\4\b\f\2\9\2\7\-\2\7\f\9\-\4\c\d\e\-\8\7\3\e\-\5\3\d\1\4\5\a\2\0\7\b\f\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\b\4\a\d\a\0\c\-\3\d\b\2\-\4\2\4\d\-\b\9\9\c\-\b\4\0\5\f\9\8\4\c\4\9\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\f\c\3\1\e\c\e\0\-\8\1\0\4\-\4\f\b\f\-\a\0\7\6\-\8\b\9\3\c\7\0\6\b\0\c\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@90 -- # cat 00:05:46.161 18:42:30 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:46.161 Expected events matched: 00:05:46.161 bdev_register:57fc9ec9-c1d0-489a-a06d-baa9bd04b482 00:05:46.161 bdev_register:74bf2927-27f9-4cde-873e-53d145a207bf 00:05:46.161 bdev_register:aio_disk 00:05:46.161 bdev_register:CryptoMallocBdev 00:05:46.161 bdev_register:db4ada0c-3db2-424d-b99c-b405f984c49b 00:05:46.161 bdev_register:fc31ece0-8104-4fbf-a076-8b93c706b0c7 00:05:46.161 bdev_register:Malloc0 00:05:46.161 bdev_register:Malloc0p0 00:05:46.161 bdev_register:Malloc0p1 00:05:46.161 bdev_register:Malloc0p2 00:05:46.161 bdev_register:Malloc1 00:05:46.162 bdev_register:Malloc3 00:05:46.162 bdev_register:MallocForCryptoBdev 00:05:46.162 bdev_register:Null0 00:05:46.162 bdev_register:Nvme0n1 00:05:46.162 bdev_register:Nvme0n1p0 00:05:46.162 bdev_register:Nvme0n1p1 00:05:46.162 bdev_register:PTBdevFromMalloc3 00:05:46.162 18:42:30 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:05:46.162 18:42:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:46.162 18:42:30 json_config -- common/autotest_common.sh@10 -- # set +x 
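The long bracketed comparison and the "Expected events matched" block above boil down to sorting both event lists and comparing them as whole strings, so ordering is irrelevant but any missing or extra bdev registration fails the check. A small sketch of that comparison with placeholder event names:

expected=(bdev_register:Malloc0 bdev_register:Nvme0n1 bdev_register:aio_disk)
recorded=(bdev_register:Nvme0n1 bdev_register:aio_disk bdev_register:Malloc0)

# Normalise both lists: one entry per line, sorted, then read back into arrays.
mapfile -t expected_sorted < <(printf '%s\n' "${expected[@]}" | sort)
mapfile -t recorded_sorted < <(printf '%s\n' "${recorded[@]}" | sort)

if [[ "${expected_sorted[*]}" != "${recorded_sorted[*]}" ]]; then
  echo "ERROR: expected and recorded events differ" >&2
  exit 1
fi
printf 'Expected events matched:\n'
printf '  %s\n' "${expected_sorted[@]}"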
00:05:46.162 18:42:30 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:46.162 18:42:30 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:46.162 18:42:30 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:05:46.162 18:42:30 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:05:46.162 18:42:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:46.162 18:42:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:46.162 18:42:31 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:05:46.162 18:42:31 json_config -- json_config/json_config.sh@304 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:46.162 18:42:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:46.162 MallocBdevForConfigChangeCheck 00:05:46.420 18:42:31 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:05:46.420 18:42:31 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:46.420 18:42:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:46.420 18:42:31 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:05:46.421 18:42:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:46.680 18:42:31 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:05:46.680 INFO: shutting down applications... 00:05:46.680 18:42:31 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:05:46.680 18:42:31 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:05:46.680 18:42:31 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:05:46.680 18:42:31 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:05:46.680 [2024-07-24 18:42:31.674040] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:05:48.583 Calling clear_iscsi_subsystem 00:05:48.583 Calling clear_nvmf_subsystem 00:05:48.583 Calling clear_nbd_subsystem 00:05:48.583 Calling clear_ublk_subsystem 00:05:48.583 Calling clear_vhost_blk_subsystem 00:05:48.583 Calling clear_vhost_scsi_subsystem 00:05:48.583 Calling clear_bdev_subsystem 00:05:48.583 18:42:33 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:05:48.583 18:42:33 json_config -- json_config/json_config.sh@347 -- # count=100 00:05:48.583 18:42:33 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:05:48.583 18:42:33 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:48.584 18:42:33 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:05:48.584 18:42:33 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:05:48.584 18:42:33 
json_config -- json_config/json_config.sh@349 -- # break 00:05:48.584 18:42:33 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:05:48.584 18:42:33 json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:05:48.584 18:42:33 json_config -- json_config/common.sh@31 -- # local app=target 00:05:48.584 18:42:33 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:48.584 18:42:33 json_config -- json_config/common.sh@35 -- # [[ -n 2006064 ]] 00:05:48.584 18:42:33 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2006064 00:05:48.584 18:42:33 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:48.584 18:42:33 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:48.584 18:42:33 json_config -- json_config/common.sh@41 -- # kill -0 2006064 00:05:48.584 18:42:33 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:05:49.151 18:42:33 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:05:49.151 18:42:33 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:49.151 18:42:33 json_config -- json_config/common.sh@41 -- # kill -0 2006064 00:05:49.151 18:42:33 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:49.151 18:42:33 json_config -- json_config/common.sh@43 -- # break 00:05:49.151 18:42:33 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:49.151 18:42:33 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:49.151 SPDK target shutdown done 00:05:49.151 18:42:33 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:05:49.151 INFO: relaunching applications... 00:05:49.151 18:42:33 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:49.151 18:42:33 json_config -- json_config/common.sh@9 -- # local app=target 00:05:49.151 18:42:33 json_config -- json_config/common.sh@10 -- # shift 00:05:49.151 18:42:33 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:49.151 18:42:33 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:49.151 18:42:33 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:49.151 18:42:33 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:49.151 18:42:33 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:49.151 18:42:33 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2008727 00:05:49.151 18:42:33 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:49.151 Waiting for target to run... 00:05:49.151 18:42:33 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:49.151 18:42:33 json_config -- json_config/common.sh@25 -- # waitforlisten 2008727 /var/tmp/spdk_tgt.sock 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@829 -- # '[' -z 2008727 ']' 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
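The shutdown sequence above is cooperative: json_config_test_shutdown_app sends a single SIGINT and then polls the pid with "kill -0" for up to thirty half-second intervals before declaring the target gone. A minimal sketch of that wait loop (the pid value is illustrative; the 30 and 0.5 mirror the common.sh logic visible in this log):

app_pid=2006064                      # illustrative; the script reads it from app_pid["$app"]

kill -SIGINT "$app_pid"              # ask spdk_tgt to shut down cleanly

for ((i = 0; i < 30; i++)); do
  if ! kill -0 "$app_pid" 2>/dev/null; then
    echo 'SPDK target shutdown done'
    break
  fi
  sleep 0.5                          # still running; poll again
done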
00:05:49.151 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:49.151 18:42:33 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:49.151 [2024-07-24 18:42:33.967165] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:49.151 [2024-07-24 18:42:33.967209] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2008727 ] 00:05:49.410 [2024-07-24 18:42:34.419170] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.668 [2024-07-24 18:42:34.499522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.668 [2024-07-24 18:42:34.553010] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:05:49.668 [2024-07-24 18:42:34.561041] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:49.668 [2024-07-24 18:42:34.569058] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:49.668 [2024-07-24 18:42:34.648380] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:52.201 [2024-07-24 18:42:36.767916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:52.201 [2024-07-24 18:42:36.767966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:05:52.201 [2024-07-24 18:42:36.767974] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:52.201 [2024-07-24 18:42:36.775936] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:52.201 [2024-07-24 18:42:36.775953] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:52.201 [2024-07-24 18:42:36.783949] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:52.201 [2024-07-24 18:42:36.783962] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:52.201 [2024-07-24 18:42:36.791990] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:05:52.201 [2024-07-24 18:42:36.792005] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:05:52.201 [2024-07-24 18:42:36.792011] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:54.736 [2024-07-24 18:42:39.642626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:54.736 [2024-07-24 18:42:39.642656] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:54.736 [2024-07-24 18:42:39.642666] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2b0b560 00:05:54.736 [2024-07-24 18:42:39.642672] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:54.736 [2024-07-24 18:42:39.642873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:54.736 [2024-07-24 18:42:39.642884] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:54.995 
18:42:39 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:54.995 18:42:39 json_config -- common/autotest_common.sh@862 -- # return 0 00:05:54.995 18:42:39 json_config -- json_config/common.sh@26 -- # echo '' 00:05:54.995 00:05:54.995 18:42:39 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:05:54.995 18:42:39 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:05:54.995 INFO: Checking if target configuration is the same... 00:05:54.995 18:42:39 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:54.995 18:42:39 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:05:54.995 18:42:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:54.995 + '[' 2 -ne 2 ']' 00:05:54.995 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:54.995 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:05:54.995 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:54.995 +++ basename /dev/fd/62 00:05:54.995 ++ mktemp /tmp/62.XXX 00:05:54.995 + tmp_file_1=/tmp/62.P4h 00:05:54.995 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:54.995 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:54.995 + tmp_file_2=/tmp/spdk_tgt_config.json.yTy 00:05:54.995 + ret=0 00:05:54.995 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:55.253 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:55.253 + diff -u /tmp/62.P4h /tmp/spdk_tgt_config.json.yTy 00:05:55.253 + echo 'INFO: JSON config files are the same' 00:05:55.253 INFO: JSON config files are the same 00:05:55.253 + rm /tmp/62.P4h /tmp/spdk_tgt_config.json.yTy 00:05:55.253 + exit 0 00:05:55.253 18:42:40 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:05:55.253 18:42:40 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:05:55.253 INFO: changing configuration and checking if this can be detected... 00:05:55.253 18:42:40 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:55.253 18:42:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:55.511 18:42:40 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:55.511 18:42:40 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:05:55.511 18:42:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:55.511 + '[' 2 -ne 2 ']' 00:05:55.512 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:55.512 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:05:55.512 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:55.512 +++ basename /dev/fd/62 00:05:55.512 ++ mktemp /tmp/62.XXX 00:05:55.512 + tmp_file_1=/tmp/62.3I7 00:05:55.512 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:55.512 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:55.512 + tmp_file_2=/tmp/spdk_tgt_config.json.Adm 00:05:55.512 + ret=0 00:05:55.512 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:55.771 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:55.771 + diff -u /tmp/62.3I7 /tmp/spdk_tgt_config.json.Adm 00:05:55.771 + ret=1 00:05:55.771 + echo '=== Start of file: /tmp/62.3I7 ===' 00:05:55.771 + cat /tmp/62.3I7 00:05:55.771 + echo '=== End of file: /tmp/62.3I7 ===' 00:05:55.771 + echo '' 00:05:55.771 + echo '=== Start of file: /tmp/spdk_tgt_config.json.Adm ===' 00:05:55.771 + cat /tmp/spdk_tgt_config.json.Adm 00:05:55.771 + echo '=== End of file: /tmp/spdk_tgt_config.json.Adm ===' 00:05:55.771 + echo '' 00:05:55.771 + rm /tmp/62.3I7 /tmp/spdk_tgt_config.json.Adm 00:05:55.771 + exit 1 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:05:55.771 INFO: configuration change detected. 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:05:55.771 18:42:40 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:55.771 18:42:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@321 -- # [[ -n 2008727 ]] 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:05:55.771 18:42:40 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:55.771 18:42:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:05:55.771 18:42:40 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:05:55.771 18:42:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:05:56.029 18:42:40 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:05:56.029 18:42:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:05:56.029 18:42:41 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:05:56.029 18:42:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:05:56.288 18:42:41 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:05:56.288 18:42:41 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@197 -- # uname -s 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:56.618 18:42:41 json_config -- json_config/json_config.sh@327 -- # killprocess 2008727 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@948 -- # '[' -z 2008727 ']' 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@952 -- # kill -0 2008727 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@953 -- # uname 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2008727 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2008727' 00:05:56.618 killing process with pid 2008727 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@967 -- # kill 2008727 00:05:56.618 18:42:41 json_config -- common/autotest_common.sh@972 -- # wait 2008727 00:05:58.555 18:42:43 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:58.555 18:42:43 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:05:58.555 18:42:43 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:58.555 18:42:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:58.555 18:42:43 json_config -- json_config/json_config.sh@332 -- # return 0 00:05:58.555 18:42:43 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:05:58.555 INFO: Success 00:05:58.555 00:05:58.555 real 0m24.177s 00:05:58.555 user 0m27.201s 00:05:58.555 sys 0m2.799s 00:05:58.555 18:42:43 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:58.555 18:42:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:58.555 ************************************ 00:05:58.555 END TEST json_config 00:05:58.555 ************************************ 00:05:58.555 18:42:43 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:58.555 18:42:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:58.555 18:42:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:58.555 18:42:43 -- common/autotest_common.sh@10 -- # set +x 00:05:58.555 ************************************ 00:05:58.556 START TEST json_config_extra_key 00:05:58.556 ************************************ 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:58.556 18:42:43 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:58.556 18:42:43 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:58.556 18:42:43 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:58.556 18:42:43 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:58.556 18:42:43 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:58.556 18:42:43 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:58.556 18:42:43 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:05:58.556 18:42:43 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:58.556 18:42:43 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:58.556 INFO: launching applications... 
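The launch that follows points spdk_tgt at test/json_config/extra_key.json. That file's contents are not reproduced in this log; as a rough illustration only, an SPDK JSON config of this kind is a "subsystems" array whose entries list RPC method/params pairs to replay at startup. A hypothetical minimal file (names and sizes are assumptions, not the real extra_key.json):

cat > /tmp/minimal_config.json << 'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF

# Launched the same way the test does below, only with the illustrative file:
#   .../build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /tmp/minimal_config.json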
00:05:58.556 18:42:43 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2010433 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:58.556 Waiting for target to run... 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2010433 /var/tmp/spdk_tgt.sock 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2010433 ']' 00:05:58.556 18:42:43 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:58.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:58.556 18:42:43 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:58.556 [2024-07-24 18:42:43.359931] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:58.556 [2024-07-24 18:42:43.359977] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2010433 ] 00:05:58.814 [2024-07-24 18:42:43.811648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.073 [2024-07-24 18:42:43.905321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.332 18:42:44 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:59.332 18:42:44 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:05:59.332 00:05:59.332 18:42:44 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:59.332 INFO: shutting down applications... 
00:05:59.332 18:42:44 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2010433 ]] 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2010433 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2010433 00:05:59.332 18:42:44 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2010433 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@43 -- # break 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:59.901 18:42:44 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:59.901 SPDK target shutdown done 00:05:59.901 18:42:44 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:59.901 Success 00:05:59.901 00:05:59.901 real 0m1.434s 00:05:59.901 user 0m0.874s 00:05:59.901 sys 0m0.530s 00:05:59.901 18:42:44 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:59.901 18:42:44 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:59.901 ************************************ 00:05:59.901 END TEST json_config_extra_key 00:05:59.901 ************************************ 00:05:59.901 18:42:44 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:59.901 18:42:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:59.901 18:42:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.901 18:42:44 -- common/autotest_common.sh@10 -- # set +x 00:05:59.901 ************************************ 00:05:59.901 START TEST alias_rpc 00:05:59.901 ************************************ 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:59.901 * Looking for test storage... 
00:05:59.901 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:05:59.901 18:42:44 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:59.901 18:42:44 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2010717 00:05:59.901 18:42:44 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2010717 00:05:59.901 18:42:44 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2010717 ']' 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.901 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:59.901 18:42:44 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:59.901 [2024-07-24 18:42:44.848662] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:05:59.901 [2024-07-24 18:42:44.848713] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2010717 ] 00:06:00.160 [2024-07-24 18:42:44.913836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.161 [2024-07-24 18:42:44.992646] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.728 18:42:45 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:00.728 18:42:45 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:00.728 18:42:45 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:00.987 18:42:45 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2010717 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2010717 ']' 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2010717 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2010717 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2010717' 00:06:00.987 killing process with pid 2010717 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@967 -- # kill 2010717 00:06:00.987 18:42:45 alias_rpc -- common/autotest_common.sh@972 -- # wait 2010717 00:06:01.246 00:06:01.246 real 0m1.469s 00:06:01.246 user 0m1.601s 00:06:01.246 sys 0m0.391s 00:06:01.246 18:42:46 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:01.246 18:42:46 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.246 ************************************ 00:06:01.246 END TEST alias_rpc 
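killprocess, used to tear down the target both here and in json_config earlier, only signals a pid it can positively identify: it verifies the process is still alive with "kill -0", resolves its name with ps, and only then kills and reaps it. A condensed sketch following the checks visible in the log (the pid is illustrative):

pid=2010717                                        # illustrative

kill -0 "$pid" || { echo "process $pid not running" >&2; exit 1; }

process_name=$(ps --no-headers -o comm= "$pid")    # an SPDK target shows up as reactor_0

if [[ "$process_name" != "sudo" ]]; then           # the sudo-wrapper case is handled separately in the real helper
  echo "killing process with pid $pid"
  kill "$pid"
  wait "$pid"                                      # reap the child so its exit status is collected
fi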
00:06:01.246 ************************************ 00:06:01.246 18:42:46 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:01.246 18:42:46 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:01.246 18:42:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:01.246 18:42:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.246 18:42:46 -- common/autotest_common.sh@10 -- # set +x 00:06:01.246 ************************************ 00:06:01.246 START TEST spdkcli_tcp 00:06:01.246 ************************************ 00:06:01.246 18:42:46 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:01.505 * Looking for test storage... 00:06:01.505 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2011004 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2011004 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2011004 ']' 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.505 18:42:46 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.505 18:42:46 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:01.505 [2024-07-24 18:42:46.376590] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
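The spdkcli_tcp test starting here drives the same JSON-RPC server, only through a TCP bridge; every rpc.py call in this log, including the rpc_get_methods listing that follows, ultimately writes a JSON-RPC 2.0 request to the target's socket and reads back the reply. A hedged sketch of doing that exchange by hand with socat (the tool this test already relies on); the id value is arbitrary, and -t 1 keeps the connection open for a second after stdin closes so the reply can arrive:

# What rpc.py sends for "rpc_get_methods", issued manually against the Unix socket.
printf '%s' '{"jsonrpc": "2.0", "method": "rpc_get_methods", "id": 1}' \
  | socat -t 1 - UNIX-CONNECT:/var/tmp/spdk.sock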
00:06:01.505 [2024-07-24 18:42:46.376641] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2011004 ] 00:06:01.505 [2024-07-24 18:42:46.441433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:01.764 [2024-07-24 18:42:46.519238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:01.764 [2024-07-24 18:42:46.519241] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.331 18:42:47 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.332 18:42:47 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:02.332 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2011223 00:06:02.332 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:02.332 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:02.332 [ 00:06:02.332 "bdev_malloc_delete", 00:06:02.332 "bdev_malloc_create", 00:06:02.332 "bdev_null_resize", 00:06:02.332 "bdev_null_delete", 00:06:02.332 "bdev_null_create", 00:06:02.332 "bdev_nvme_cuse_unregister", 00:06:02.332 "bdev_nvme_cuse_register", 00:06:02.332 "bdev_opal_new_user", 00:06:02.332 "bdev_opal_set_lock_state", 00:06:02.332 "bdev_opal_delete", 00:06:02.332 "bdev_opal_get_info", 00:06:02.332 "bdev_opal_create", 00:06:02.332 "bdev_nvme_opal_revert", 00:06:02.332 "bdev_nvme_opal_init", 00:06:02.332 "bdev_nvme_send_cmd", 00:06:02.332 "bdev_nvme_get_path_iostat", 00:06:02.332 "bdev_nvme_get_mdns_discovery_info", 00:06:02.332 "bdev_nvme_stop_mdns_discovery", 00:06:02.332 "bdev_nvme_start_mdns_discovery", 00:06:02.332 "bdev_nvme_set_multipath_policy", 00:06:02.332 "bdev_nvme_set_preferred_path", 00:06:02.332 "bdev_nvme_get_io_paths", 00:06:02.332 "bdev_nvme_remove_error_injection", 00:06:02.332 "bdev_nvme_add_error_injection", 00:06:02.332 "bdev_nvme_get_discovery_info", 00:06:02.332 "bdev_nvme_stop_discovery", 00:06:02.332 "bdev_nvme_start_discovery", 00:06:02.332 "bdev_nvme_get_controller_health_info", 00:06:02.332 "bdev_nvme_disable_controller", 00:06:02.332 "bdev_nvme_enable_controller", 00:06:02.332 "bdev_nvme_reset_controller", 00:06:02.332 "bdev_nvme_get_transport_statistics", 00:06:02.332 "bdev_nvme_apply_firmware", 00:06:02.332 "bdev_nvme_detach_controller", 00:06:02.332 "bdev_nvme_get_controllers", 00:06:02.332 "bdev_nvme_attach_controller", 00:06:02.332 "bdev_nvme_set_hotplug", 00:06:02.332 "bdev_nvme_set_options", 00:06:02.332 "bdev_passthru_delete", 00:06:02.332 "bdev_passthru_create", 00:06:02.332 "bdev_lvol_set_parent_bdev", 00:06:02.332 "bdev_lvol_set_parent", 00:06:02.332 "bdev_lvol_check_shallow_copy", 00:06:02.332 "bdev_lvol_start_shallow_copy", 00:06:02.332 "bdev_lvol_grow_lvstore", 00:06:02.332 "bdev_lvol_get_lvols", 00:06:02.332 "bdev_lvol_get_lvstores", 00:06:02.332 "bdev_lvol_delete", 00:06:02.332 "bdev_lvol_set_read_only", 00:06:02.332 "bdev_lvol_resize", 00:06:02.332 "bdev_lvol_decouple_parent", 00:06:02.332 "bdev_lvol_inflate", 00:06:02.332 "bdev_lvol_rename", 00:06:02.332 "bdev_lvol_clone_bdev", 00:06:02.332 "bdev_lvol_clone", 00:06:02.332 "bdev_lvol_snapshot", 00:06:02.332 "bdev_lvol_create", 00:06:02.332 "bdev_lvol_delete_lvstore", 00:06:02.332 "bdev_lvol_rename_lvstore", 00:06:02.332 "bdev_lvol_create_lvstore", 
00:06:02.332 "bdev_raid_set_options", 00:06:02.332 "bdev_raid_remove_base_bdev", 00:06:02.332 "bdev_raid_add_base_bdev", 00:06:02.332 "bdev_raid_delete", 00:06:02.332 "bdev_raid_create", 00:06:02.332 "bdev_raid_get_bdevs", 00:06:02.332 "bdev_error_inject_error", 00:06:02.332 "bdev_error_delete", 00:06:02.332 "bdev_error_create", 00:06:02.332 "bdev_split_delete", 00:06:02.332 "bdev_split_create", 00:06:02.332 "bdev_delay_delete", 00:06:02.332 "bdev_delay_create", 00:06:02.332 "bdev_delay_update_latency", 00:06:02.332 "bdev_zone_block_delete", 00:06:02.332 "bdev_zone_block_create", 00:06:02.332 "blobfs_create", 00:06:02.332 "blobfs_detect", 00:06:02.332 "blobfs_set_cache_size", 00:06:02.332 "bdev_crypto_delete", 00:06:02.332 "bdev_crypto_create", 00:06:02.332 "bdev_compress_delete", 00:06:02.332 "bdev_compress_create", 00:06:02.332 "bdev_compress_get_orphans", 00:06:02.332 "bdev_aio_delete", 00:06:02.332 "bdev_aio_rescan", 00:06:02.332 "bdev_aio_create", 00:06:02.332 "bdev_ftl_set_property", 00:06:02.332 "bdev_ftl_get_properties", 00:06:02.332 "bdev_ftl_get_stats", 00:06:02.332 "bdev_ftl_unmap", 00:06:02.332 "bdev_ftl_unload", 00:06:02.332 "bdev_ftl_delete", 00:06:02.332 "bdev_ftl_load", 00:06:02.332 "bdev_ftl_create", 00:06:02.332 "bdev_virtio_attach_controller", 00:06:02.332 "bdev_virtio_scsi_get_devices", 00:06:02.332 "bdev_virtio_detach_controller", 00:06:02.332 "bdev_virtio_blk_set_hotplug", 00:06:02.332 "bdev_iscsi_delete", 00:06:02.332 "bdev_iscsi_create", 00:06:02.332 "bdev_iscsi_set_options", 00:06:02.332 "accel_error_inject_error", 00:06:02.332 "ioat_scan_accel_module", 00:06:02.332 "dsa_scan_accel_module", 00:06:02.332 "iaa_scan_accel_module", 00:06:02.332 "dpdk_cryptodev_get_driver", 00:06:02.332 "dpdk_cryptodev_set_driver", 00:06:02.332 "dpdk_cryptodev_scan_accel_module", 00:06:02.332 "compressdev_scan_accel_module", 00:06:02.332 "keyring_file_remove_key", 00:06:02.332 "keyring_file_add_key", 00:06:02.332 "keyring_linux_set_options", 00:06:02.332 "iscsi_get_histogram", 00:06:02.332 "iscsi_enable_histogram", 00:06:02.332 "iscsi_set_options", 00:06:02.332 "iscsi_get_auth_groups", 00:06:02.332 "iscsi_auth_group_remove_secret", 00:06:02.332 "iscsi_auth_group_add_secret", 00:06:02.332 "iscsi_delete_auth_group", 00:06:02.332 "iscsi_create_auth_group", 00:06:02.332 "iscsi_set_discovery_auth", 00:06:02.332 "iscsi_get_options", 00:06:02.332 "iscsi_target_node_request_logout", 00:06:02.332 "iscsi_target_node_set_redirect", 00:06:02.332 "iscsi_target_node_set_auth", 00:06:02.332 "iscsi_target_node_add_lun", 00:06:02.332 "iscsi_get_stats", 00:06:02.332 "iscsi_get_connections", 00:06:02.332 "iscsi_portal_group_set_auth", 00:06:02.332 "iscsi_start_portal_group", 00:06:02.332 "iscsi_delete_portal_group", 00:06:02.332 "iscsi_create_portal_group", 00:06:02.332 "iscsi_get_portal_groups", 00:06:02.332 "iscsi_delete_target_node", 00:06:02.332 "iscsi_target_node_remove_pg_ig_maps", 00:06:02.332 "iscsi_target_node_add_pg_ig_maps", 00:06:02.332 "iscsi_create_target_node", 00:06:02.332 "iscsi_get_target_nodes", 00:06:02.332 "iscsi_delete_initiator_group", 00:06:02.332 "iscsi_initiator_group_remove_initiators", 00:06:02.332 "iscsi_initiator_group_add_initiators", 00:06:02.332 "iscsi_create_initiator_group", 00:06:02.332 "iscsi_get_initiator_groups", 00:06:02.332 "nvmf_set_crdt", 00:06:02.332 "nvmf_set_config", 00:06:02.332 "nvmf_set_max_subsystems", 00:06:02.332 "nvmf_stop_mdns_prr", 00:06:02.332 "nvmf_publish_mdns_prr", 00:06:02.332 "nvmf_subsystem_get_listeners", 00:06:02.332 
"nvmf_subsystem_get_qpairs", 00:06:02.332 "nvmf_subsystem_get_controllers", 00:06:02.332 "nvmf_get_stats", 00:06:02.332 "nvmf_get_transports", 00:06:02.332 "nvmf_create_transport", 00:06:02.332 "nvmf_get_targets", 00:06:02.332 "nvmf_delete_target", 00:06:02.332 "nvmf_create_target", 00:06:02.332 "nvmf_subsystem_allow_any_host", 00:06:02.332 "nvmf_subsystem_remove_host", 00:06:02.332 "nvmf_subsystem_add_host", 00:06:02.332 "nvmf_ns_remove_host", 00:06:02.332 "nvmf_ns_add_host", 00:06:02.332 "nvmf_subsystem_remove_ns", 00:06:02.332 "nvmf_subsystem_add_ns", 00:06:02.332 "nvmf_subsystem_listener_set_ana_state", 00:06:02.332 "nvmf_discovery_get_referrals", 00:06:02.332 "nvmf_discovery_remove_referral", 00:06:02.332 "nvmf_discovery_add_referral", 00:06:02.332 "nvmf_subsystem_remove_listener", 00:06:02.332 "nvmf_subsystem_add_listener", 00:06:02.332 "nvmf_delete_subsystem", 00:06:02.332 "nvmf_create_subsystem", 00:06:02.332 "nvmf_get_subsystems", 00:06:02.332 "env_dpdk_get_mem_stats", 00:06:02.332 "nbd_get_disks", 00:06:02.332 "nbd_stop_disk", 00:06:02.332 "nbd_start_disk", 00:06:02.332 "ublk_recover_disk", 00:06:02.332 "ublk_get_disks", 00:06:02.332 "ublk_stop_disk", 00:06:02.332 "ublk_start_disk", 00:06:02.332 "ublk_destroy_target", 00:06:02.332 "ublk_create_target", 00:06:02.332 "virtio_blk_create_transport", 00:06:02.332 "virtio_blk_get_transports", 00:06:02.332 "vhost_controller_set_coalescing", 00:06:02.332 "vhost_get_controllers", 00:06:02.332 "vhost_delete_controller", 00:06:02.332 "vhost_create_blk_controller", 00:06:02.332 "vhost_scsi_controller_remove_target", 00:06:02.332 "vhost_scsi_controller_add_target", 00:06:02.332 "vhost_start_scsi_controller", 00:06:02.332 "vhost_create_scsi_controller", 00:06:02.332 "thread_set_cpumask", 00:06:02.332 "framework_get_governor", 00:06:02.332 "framework_get_scheduler", 00:06:02.332 "framework_set_scheduler", 00:06:02.332 "framework_get_reactors", 00:06:02.332 "thread_get_io_channels", 00:06:02.332 "thread_get_pollers", 00:06:02.332 "thread_get_stats", 00:06:02.332 "framework_monitor_context_switch", 00:06:02.332 "spdk_kill_instance", 00:06:02.332 "log_enable_timestamps", 00:06:02.332 "log_get_flags", 00:06:02.332 "log_clear_flag", 00:06:02.332 "log_set_flag", 00:06:02.332 "log_get_level", 00:06:02.332 "log_set_level", 00:06:02.333 "log_get_print_level", 00:06:02.333 "log_set_print_level", 00:06:02.333 "framework_enable_cpumask_locks", 00:06:02.333 "framework_disable_cpumask_locks", 00:06:02.333 "framework_wait_init", 00:06:02.333 "framework_start_init", 00:06:02.333 "scsi_get_devices", 00:06:02.333 "bdev_get_histogram", 00:06:02.333 "bdev_enable_histogram", 00:06:02.333 "bdev_set_qos_limit", 00:06:02.333 "bdev_set_qd_sampling_period", 00:06:02.333 "bdev_get_bdevs", 00:06:02.333 "bdev_reset_iostat", 00:06:02.333 "bdev_get_iostat", 00:06:02.333 "bdev_examine", 00:06:02.333 "bdev_wait_for_examine", 00:06:02.333 "bdev_set_options", 00:06:02.333 "notify_get_notifications", 00:06:02.333 "notify_get_types", 00:06:02.333 "accel_get_stats", 00:06:02.333 "accel_set_options", 00:06:02.333 "accel_set_driver", 00:06:02.333 "accel_crypto_key_destroy", 00:06:02.333 "accel_crypto_keys_get", 00:06:02.333 "accel_crypto_key_create", 00:06:02.333 "accel_assign_opc", 00:06:02.333 "accel_get_module_info", 00:06:02.333 "accel_get_opc_assignments", 00:06:02.333 "vmd_rescan", 00:06:02.333 "vmd_remove_device", 00:06:02.333 "vmd_enable", 00:06:02.333 "sock_get_default_impl", 00:06:02.333 "sock_set_default_impl", 00:06:02.333 "sock_impl_set_options", 00:06:02.333 
"sock_impl_get_options", 00:06:02.333 "iobuf_get_stats", 00:06:02.333 "iobuf_set_options", 00:06:02.333 "framework_get_pci_devices", 00:06:02.333 "framework_get_config", 00:06:02.333 "framework_get_subsystems", 00:06:02.333 "trace_get_info", 00:06:02.333 "trace_get_tpoint_group_mask", 00:06:02.333 "trace_disable_tpoint_group", 00:06:02.333 "trace_enable_tpoint_group", 00:06:02.333 "trace_clear_tpoint_mask", 00:06:02.333 "trace_set_tpoint_mask", 00:06:02.333 "keyring_get_keys", 00:06:02.333 "spdk_get_version", 00:06:02.333 "rpc_get_methods" 00:06:02.333 ] 00:06:02.592 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:02.592 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:02.592 18:42:47 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2011004 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2011004 ']' 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2011004 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2011004 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2011004' 00:06:02.592 killing process with pid 2011004 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2011004 00:06:02.592 18:42:47 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2011004 00:06:02.851 00:06:02.851 real 0m1.488s 00:06:02.851 user 0m2.762s 00:06:02.851 sys 0m0.418s 00:06:02.851 18:42:47 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:02.851 18:42:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:02.851 ************************************ 00:06:02.851 END TEST spdkcli_tcp 00:06:02.851 ************************************ 00:06:02.851 18:42:47 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:02.851 18:42:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:02.851 18:42:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.851 18:42:47 -- common/autotest_common.sh@10 -- # set +x 00:06:02.851 ************************************ 00:06:02.851 START TEST dpdk_mem_utility 00:06:02.851 ************************************ 00:06:02.851 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:03.111 * Looking for test storage... 
00:06:03.111 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:03.111 18:42:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:03.111 18:42:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2011413 00:06:03.111 18:42:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2011413 00:06:03.111 18:42:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2011413 ']' 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.111 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:03.111 18:42:47 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:03.111 [2024-07-24 18:42:47.930857] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:03.111 [2024-07-24 18:42:47.930909] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2011413 ] 00:06:03.111 [2024-07-24 18:42:47.995674] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.111 [2024-07-24 18:42:48.072699] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.052 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.052 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:04.052 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:04.052 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:04.052 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:04.052 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:04.052 { 00:06:04.052 "filename": "/tmp/spdk_mem_dump.txt" 00:06:04.052 } 00:06:04.052 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:04.052 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:04.052 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:04.052 1 heaps totaling size 814.000000 MiB 00:06:04.052 size: 814.000000 MiB heap id: 0 00:06:04.052 end heaps---------- 00:06:04.052 8 mempools totaling size 598.116089 MiB 00:06:04.052 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:04.052 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:04.052 size: 84.521057 MiB name: bdev_io_2011413 00:06:04.052 size: 51.011292 MiB name: evtpool_2011413 00:06:04.052 size: 50.003479 MiB name: msgpool_2011413 00:06:04.052 size: 21.763794 MiB 
name: PDU_Pool 00:06:04.052 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:04.052 size: 0.026123 MiB name: Session_Pool 00:06:04.052 end mempools------- 00:06:04.052 201 memzones totaling size 4.176453 MiB 00:06:04.052 size: 1.000366 MiB name: RG_ring_0_2011413 00:06:04.052 size: 1.000366 MiB name: RG_ring_1_2011413 00:06:04.052 size: 1.000366 MiB name: RG_ring_4_2011413 00:06:04.052 size: 1.000366 MiB name: RG_ring_5_2011413 00:06:04.052 size: 0.125366 MiB name: RG_ring_2_2011413 00:06:04.052 size: 0.015991 MiB name: RG_ring_3_2011413 00:06:04.052 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:04.052 size: 0.000305 MiB name: 0000:1a:01.0_qat 00:06:04.052 size: 0.000305 MiB name: 0000:1a:01.1_qat 00:06:04.052 size: 0.000305 MiB name: 0000:1a:01.2_qat 00:06:04.052 size: 0.000305 MiB name: 0000:1a:01.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:01.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:01.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:01.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:01.7_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.0_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.1_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.2_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1a:02.7_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.0_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.1_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.2_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:01.7_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.0_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.1_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.2_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1c:02.7_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.0_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.1_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.2_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:01.7_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.0_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.1_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.2_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.3_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.4_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.5_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:04.053 size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:04.053 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:04.053 size: 
0.000122 MiB name: rte_cryptodev_data_3 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_20 
00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:04.053 size: 0.000122 MiB 
name: rte_cryptodev_data_81 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:04.053 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:04.053 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:04.054 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:04.054 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:04.054 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:04.054 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:04.054 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:04.054 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:04.054 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:04.054 end memzones------- 00:06:04.054 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:04.054 heap id: 0 total size: 814.000000 MiB number of busy elements: 637 number of free elements: 14 00:06:04.054 list of free elements. size: 11.781372 MiB 00:06:04.054 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:04.054 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:04.054 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:04.054 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:04.054 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:04.054 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:04.054 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:04.054 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:04.054 element at address: 0x20001aa00000 with size: 0.564758 MiB 00:06:04.054 element at address: 0x200003a00000 with size: 0.494507 MiB 00:06:04.054 element at address: 0x20000b200000 with size: 0.488892 MiB 00:06:04.054 element at address: 0x200000800000 with size: 0.486511 MiB 00:06:04.054 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:04.054 element at address: 0x200027e00000 with size: 0.395752 MiB 00:06:04.054 list of standard malloc elements. 
size: 199.898621 MiB 00:06:04.054 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:04.054 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:04.054 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:04.054 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:04.054 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:04.054 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:04.054 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:04.054 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:04.054 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000032f740 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000333200 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000033a780 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000033e240 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000341d00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000349280 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000350800 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000357d80 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000035b840 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000035f300 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:06:04.054 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:04.054 element at address: 0x200000329b80 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000032d640 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000331100 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000332180 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000335c40 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000338680 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000339700 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000033c140 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000340c80 with size: 0.004028 MiB 00:06:04.054 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000344740 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000347180 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000348200 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000034e700 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000034f780 with size: 0.004028 MiB 00:06:04.054 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000353240 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000355c80 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000356d00 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000359740 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000035d200 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000035e280 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:04.054 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:04.054 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:04.054 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:04.054 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:04.054 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:04.055 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:04.055 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:04.055 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:04.055 element at address: 0x200000200000 with size: 0.000305 MiB 00:06:04.055 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:04.055 element at address: 0x200000200140 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200200 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200380 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200440 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200500 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200680 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200740 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200800 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200980 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200a40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200b00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200c80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200d40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200e00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002010c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000205380 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225640 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225700 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225880 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225940 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225a00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225b80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225c40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225d00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225e80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000225f40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226000 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226180 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226240 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226300 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226500 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002265c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226680 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226740 with size: 0.000183 MiB 
00:06:04.055 element at address: 0x200000226800 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226980 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226a40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226b00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226c80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226d40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226e00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000226f80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000227040 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000227100 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000329300 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000329580 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000329640 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000329800 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000032d040 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000032d100 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000330940 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000330b00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000330bc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000330d80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000334400 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003345c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000334680 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000334840 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000337ec0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000338080 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000338140 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000338300 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033b980 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033bb40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033bc00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033f440 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033f600 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033f6c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000033f880 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000342f00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003430c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000343180 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000343340 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003469c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000346b80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000346c40 with size: 0.000183 MiB 00:06:04.055 element at 
address: 0x200000346e00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034a480 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034a640 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034a700 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034a8c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034df40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034e100 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034e1c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000034e380 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000351a00 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000351bc0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000351c80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000351e40 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003554c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000355680 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000355740 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000355900 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000358f80 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000359140 with size: 0.000183 MiB 00:06:04.055 element at address: 0x200000359200 with size: 0.000183 MiB 00:06:04.055 element at address: 0x2000003593c0 with size: 0.000183 MiB 00:06:04.055 element at address: 0x20000035ca40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000035cc00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000035ccc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000035ce80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000360500 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003606c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000360780 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000360940 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000367d00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000376580 
with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000376740 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000390280 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000397640 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a2840 with size: 0.000183 MiB 
00:06:04.056 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:04.056 element at 
address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7e980 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7ea40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7eb00 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7ebc0 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7ec80 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7ed40 with size: 0.000183 MiB 00:06:04.056 element at address: 0x200003a7ee00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7eec0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7ef80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f040 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f100 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f1c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f280 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:06:04.057 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:06:04.057 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d7c0 
with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92680 with size: 0.000183 MiB 
00:06:04.057 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:06:04.057 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:06:04.058 element at 
address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:04.058 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e65500 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6dd40 
with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:04.058 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:04.058 list of memzone associated elements. 
size: 602.320007 MiB 00:06:04.058 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:04.058 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:04.058 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:04.058 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:04.058 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:04.058 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2011413_0 00:06:04.058 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:04.058 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2011413_0 00:06:04.058 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:04.058 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2011413_0 00:06:04.058 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:04.058 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:04.058 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:04.058 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:04.058 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:04.058 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2011413 00:06:04.058 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:04.058 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2011413 00:06:04.058 element at address: 0x2000002271c0 with size: 1.008118 MiB 00:06:04.058 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2011413 00:06:04.058 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:04.058 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:04.058 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:04.058 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:04.058 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:04.058 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:04.058 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:04.058 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:04.058 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:04.058 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2011413 00:06:04.058 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:04.058 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2011413 00:06:04.058 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:04.058 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2011413 00:06:04.058 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:04.058 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2011413 00:06:04.058 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:04.058 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2011413 00:06:04.058 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:04.058 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:04.058 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:04.058 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:04.058 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:04.058 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:04.058 element at address: 0x200000205440 with size: 0.125488 MiB 00:06:04.058 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_2011413 00:06:04.058 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:04.058 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:04.058 element at address: 0x200027e65680 with size: 0.023743 MiB 00:06:04.059 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:04.059 element at address: 0x200000201180 with size: 0.016113 MiB 00:06:04.059 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2011413 00:06:04.059 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:06:04.059 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:04.059 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:04.059 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:04.059 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.0_qat 00:06:04.059 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.1_qat 00:06:04.059 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.2_qat 00:06:04.059 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.3_qat 00:06:04.059 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.4_qat 00:06:04.059 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.5_qat 00:06:04.059 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.6_qat 00:06:04.059 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.7_qat 00:06:04.059 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.0_qat 00:06:04.059 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.1_qat 00:06:04.059 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.2_qat 00:06:04.059 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.3_qat 00:06:04.059 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.4_qat 00:06:04.059 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.5_qat 00:06:04.059 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.6_qat 00:06:04.059 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.7_qat 00:06:04.059 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.0_qat 00:06:04.059 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 
0000:1c:01.1_qat 00:06:04.059 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.2_qat 00:06:04.059 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.3_qat 00:06:04.059 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.4_qat 00:06:04.059 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.5_qat 00:06:04.059 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.6_qat 00:06:04.059 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.7_qat 00:06:04.059 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.0_qat 00:06:04.059 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.1_qat 00:06:04.059 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.2_qat 00:06:04.059 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.3_qat 00:06:04.059 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.4_qat 00:06:04.059 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.5_qat 00:06:04.059 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.6_qat 00:06:04.059 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.7_qat 00:06:04.059 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.0_qat 00:06:04.059 element at address: 0x20000035d040 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.1_qat 00:06:04.059 element at address: 0x200000359580 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.2_qat 00:06:04.059 element at address: 0x200000355ac0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.3_qat 00:06:04.059 element at address: 0x200000352000 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.4_qat 00:06:04.059 element at address: 0x20000034e540 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.5_qat 00:06:04.059 element at address: 0x20000034aa80 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.6_qat 00:06:04.059 element at address: 0x200000346fc0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.7_qat 00:06:04.059 element at address: 0x200000343500 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.0_qat 00:06:04.059 element at address: 
0x20000033fa40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.1_qat 00:06:04.059 element at address: 0x20000033bf80 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.2_qat 00:06:04.059 element at address: 0x2000003384c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.3_qat 00:06:04.059 element at address: 0x200000334a00 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.4_qat 00:06:04.059 element at address: 0x200000330f40 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.5_qat 00:06:04.059 element at address: 0x20000032d480 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:04.059 element at address: 0x2000003299c0 with size: 0.000427 MiB 00:06:04.059 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:04.059 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:04.059 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:04.059 element at address: 0x2000002263c0 with size: 0.000305 MiB 00:06:04.059 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2011413 00:06:04.059 element at address: 0x200000200f80 with size: 0.000305 MiB 00:06:04.059 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2011413 00:06:04.059 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:06:04.059 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:04.059 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:04.059 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:04.059 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:04.059 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:04.059 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:04.059 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:04.059 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:04.059 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:04.059 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:04.059 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:04.059 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:04.059 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:04.059 element at 
address: 0x2000003c7700 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:04.059 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:04.059 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:04.059 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:04.059 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:04.059 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:04.060 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:04.060 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:04.060 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:04.060 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:04.060 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:04.060 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:04.060 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:04.060 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:04.060 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:04.060 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:04.060 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:04.060 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:04.060 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:04.060 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:04.060 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:04.060 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:04.060 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:04.060 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_23 00:06:04.060 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:06:04.060 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:04.060 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:04.060 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:04.060 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:04.060 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:04.060 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:04.060 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:04.060 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:04.060 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:04.060 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:04.060 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:04.060 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:04.060 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:04.060 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:04.060 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:04.060 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:04.060 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:04.060 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:04.060 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:04.060 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:04.060 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:04.060 element at address: 
0x2000003905c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:04.060 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:04.060 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:04.060 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:04.060 element at address: 0x20000038c940 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:04.060 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:04.060 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:04.060 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:04.060 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:04.060 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:04.060 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:04.060 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:04.060 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:04.060 element at address: 0x200000381900 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:04.060 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:04.060 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:04.060 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:04.060 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:04.060 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:04.060 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:04.060 element at address: 0x20000037a100 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:04.060 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:04.060 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_53 00:06:04.060 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:04.060 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:04.060 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:04.060 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:04.060 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:04.060 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:04.060 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:06:04.060 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:04.061 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:04.061 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:04.061 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:04.061 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:04.061 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:04.061 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:04.061 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:04.061 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:04.061 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:04.061 element at address: 0x200000360a00 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:04.061 element at address: 0x200000360840 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:04.061 element at address: 0x2000003605c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 00:06:04.061 element at address: 0x20000035cf40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:04.061 element at address: 0x20000035cd80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:04.061 element at address: 0x20000035cb00 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:06:04.061 element at address: 
0x200000359480 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:04.061 element at address: 0x2000003592c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:04.061 element at address: 0x200000359040 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:06:04.061 element at address: 0x2000003559c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:04.061 element at address: 0x200000355800 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:04.061 element at address: 0x200000355580 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:06:04.061 element at address: 0x200000351f00 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:04.061 element at address: 0x200000351d40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:04.061 element at address: 0x200000351ac0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:06:04.061 element at address: 0x20000034e440 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:04.061 element at address: 0x20000034e280 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:04.061 element at address: 0x20000034e000 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:06:04.061 element at address: 0x20000034a980 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:04.061 element at address: 0x20000034a7c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:04.061 element at address: 0x20000034a540 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:06:04.061 element at address: 0x200000346ec0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:04.061 element at address: 0x200000346d00 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:04.061 element at address: 0x200000346a80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:06:04.061 element at address: 0x200000343400 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:04.061 element at address: 0x200000343240 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:04.061 element at address: 0x200000342fc0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:06:04.061 element at address: 0x20000033f940 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:04.061 element at address: 0x20000033f780 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_83 00:06:04.061 element at address: 0x20000033f500 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41 00:06:04.061 element at address: 0x20000033be80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:04.061 element at address: 0x20000033bcc0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:04.061 element at address: 0x20000033ba40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:06:04.061 element at address: 0x2000003383c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:04.061 element at address: 0x200000338200 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:04.061 element at address: 0x200000337f80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:06:04.061 element at address: 0x200000334900 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:04.061 element at address: 0x200000334740 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:04.061 element at address: 0x2000003344c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:06:04.061 element at address: 0x200000330e40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:04.061 element at address: 0x200000330c80 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:04.061 element at address: 0x200000330a00 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45 00:06:04.061 element at address: 0x20000032d380 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:04.061 element at address: 0x20000032d1c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:04.061 element at address: 0x20000032cf40 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46 00:06:04.061 element at address: 0x2000003298c0 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:04.061 element at address: 0x200000329700 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:04.061 element at address: 0x200000329480 with size: 0.000244 MiB 00:06:04.061 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47 00:06:04.061 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:06:04.061 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:04.061 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:04.061 18:42:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2011413 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2011413 ']' 00:06:04.061 18:42:48 
dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 2011413 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2011413 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2011413' 00:06:04.061 killing process with pid 2011413 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2011413 00:06:04.061 18:42:48 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2011413 00:06:04.321 00:06:04.321 real 0m1.429s 00:06:04.321 user 0m1.524s 00:06:04.321 sys 0m0.402s 00:06:04.321 18:42:49 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:04.321 18:42:49 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:04.321 ************************************ 00:06:04.321 END TEST dpdk_mem_utility 00:06:04.321 ************************************ 00:06:04.321 18:42:49 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:04.321 18:42:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.321 18:42:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.321 18:42:49 -- common/autotest_common.sh@10 -- # set +x 00:06:04.321 ************************************ 00:06:04.321 START TEST event 00:06:04.321 ************************************ 00:06:04.321 18:42:49 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:04.579 * Looking for test storage... 00:06:04.579 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:04.579 18:42:49 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:04.579 18:42:49 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:04.579 18:42:49 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:04.579 18:42:49 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:04.579 18:42:49 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.579 18:42:49 event -- common/autotest_common.sh@10 -- # set +x 00:06:04.579 ************************************ 00:06:04.579 START TEST event_perf 00:06:04.579 ************************************ 00:06:04.579 18:42:49 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:04.579 Running I/O for 1 seconds...[2024-07-24 18:42:49.399317] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
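The xtrace above walks through the killprocess helper for pid 2011413: check that a pid was supplied, confirm the process still exists with kill -0, on Linux read its command name with ps --no-headers -o comm= (reactor_0 here) so a bare sudo wrapper is not killed by mistake, then kill it and wait for it to be reaped. A minimal bash sketch of that pattern, assuming the process is a child of the current shell; the real helper in common/autotest_common.sh handles more corner cases:

  killprocess_sketch() {
      local pid=$1
      [ -z "$pid" ] && return 1                    # no pid supplied
      kill -0 "$pid" 2>/dev/null || return 1       # process must still exist
      if [ "$(uname)" = Linux ]; then
          local name
          name=$(ps --no-headers -o comm= "$pid")  # e.g. reactor_0 in the run above
          # the real helper special-cases name == sudo; this sketch simply refuses
          [ "$name" = sudo ] && return 1
      fi
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid" 2>/dev/null || true              # reap it if it is our child
  }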
00:06:04.579 [2024-07-24 18:42:49.399363] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2011785 ] 00:06:04.579 [2024-07-24 18:42:49.463737] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:04.580 [2024-07-24 18:42:49.538747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:04.580 [2024-07-24 18:42:49.538844] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:04.580 [2024-07-24 18:42:49.538956] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:04.580 [2024-07-24 18:42:49.538957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.958 Running I/O for 1 seconds... 00:06:05.958 lcore 0: 216188 00:06:05.958 lcore 1: 216188 00:06:05.958 lcore 2: 216187 00:06:05.958 lcore 3: 216187 00:06:05.958 done. 00:06:05.958 00:06:05.958 real 0m1.235s 00:06:05.958 user 0m4.158s 00:06:05.958 sys 0m0.074s 00:06:05.958 18:42:50 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:05.958 18:42:50 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:05.958 ************************************ 00:06:05.958 END TEST event_perf 00:06:05.958 ************************************ 00:06:05.958 18:42:50 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:05.958 18:42:50 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:05.958 18:42:50 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.958 18:42:50 event -- common/autotest_common.sh@10 -- # set +x 00:06:05.958 ************************************ 00:06:05.958 START TEST event_reactor 00:06:05.958 ************************************ 00:06:05.958 18:42:50 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:05.958 [2024-07-24 18:42:50.693874] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
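In the event_perf run above, the core mask -m 0xF starts a reactor on four lcores and each one reports how many events it processed in the one-second window (-t 1), roughly 216k per core. A quick way to total those per-lcore lines from a saved copy of the output (event_perf.log is a hypothetical file name; $NF picks up the count even if a timestamp prefix is present):

  awk '/lcore [0-9]+:/ { total += $NF; cores++ }
       END { printf "total: %d events on %d cores\n", total, cores }' event_perf.log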
00:06:05.958 [2024-07-24 18:42:50.693919] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2011992 ] 00:06:05.958 [2024-07-24 18:42:50.757236] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.958 [2024-07-24 18:42:50.827361] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.895 test_start 00:06:06.895 oneshot 00:06:06.895 tick 100 00:06:06.895 tick 100 00:06:06.895 tick 250 00:06:06.895 tick 100 00:06:06.895 tick 100 00:06:06.895 tick 100 00:06:06.895 tick 250 00:06:06.895 tick 500 00:06:06.895 tick 100 00:06:06.895 tick 100 00:06:06.895 tick 250 00:06:06.895 tick 100 00:06:06.895 tick 100 00:06:06.895 test_end 00:06:06.895 00:06:06.895 real 0m1.220s 00:06:06.895 user 0m1.145s 00:06:06.895 sys 0m0.071s 00:06:06.895 18:42:51 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:06.895 18:42:51 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:06.895 ************************************ 00:06:06.895 END TEST event_reactor 00:06:06.895 ************************************ 00:06:07.154 18:42:51 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:07.154 18:42:51 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:07.154 18:42:51 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.154 18:42:51 event -- common/autotest_common.sh@10 -- # set +x 00:06:07.154 ************************************ 00:06:07.154 START TEST event_reactor_perf 00:06:07.154 ************************************ 00:06:07.154 18:42:51 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:07.154 [2024-07-24 18:42:51.980295] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:06:07.154 [2024-07-24 18:42:51.980351] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2012193 ] 00:06:07.154 [2024-07-24 18:42:52.047259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.154 [2024-07-24 18:42:52.118359] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.534 test_start 00:06:08.534 test_end 00:06:08.534 Performance: 522670 events per second 00:06:08.534 00:06:08.534 real 0m1.228s 00:06:08.534 user 0m1.144s 00:06:08.534 sys 0m0.080s 00:06:08.534 18:42:53 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:08.534 18:42:53 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:08.534 ************************************ 00:06:08.534 END TEST event_reactor_perf 00:06:08.534 ************************************ 00:06:08.534 18:42:53 event -- event/event.sh@49 -- # uname -s 00:06:08.534 18:42:53 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:08.534 18:42:53 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:08.534 18:42:53 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:08.534 18:42:53 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.534 18:42:53 event -- common/autotest_common.sh@10 -- # set +x 00:06:08.534 ************************************ 00:06:08.534 START TEST event_scheduler 00:06:08.534 ************************************ 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:08.534 * Looking for test storage... 00:06:08.534 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:08.534 18:42:53 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:08.534 18:42:53 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2012471 00:06:08.534 18:42:53 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:08.534 18:42:53 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:08.534 18:42:53 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2012471 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2012471 ']' 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:08.534 18:42:53 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:08.534 [2024-07-24 18:42:53.382813] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:06:08.534 [2024-07-24 18:42:53.382860] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2012471 ] 00:06:08.534 [2024-07-24 18:42:53.442311] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:08.534 [2024-07-24 18:42:53.524176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.534 [2024-07-24 18:42:53.524200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:08.534 [2024-07-24 18:42:53.524266] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:08.534 [2024-07-24 18:42:53.524267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:09.470 18:42:54 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 [2024-07-24 18:42:54.190611] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:09.470 [2024-07-24 18:42:54.190630] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:06:09.470 [2024-07-24 18:42:54.190639] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:09.470 [2024-07-24 18:42:54.190645] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:09.470 [2024-07-24 18:42:54.190650] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.470 18:42:54 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 [2024-07-24 18:42:54.270271] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.470 18:42:54 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 ************************************ 00:06:09.470 START TEST scheduler_create_thread 00:06:09.470 ************************************ 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 2 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 3 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.470 4 00:06:09.470 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 5 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 6 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 7 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 8 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 9 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 10 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:09.471 18:42:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:11.375 18:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:11.375 18:42:55 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:11.375 18:42:55 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:11.375 18:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:11.375 18:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:11.943 18:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:11.943 00:06:11.943 real 0m2.619s 00:06:11.943 user 0m0.022s 00:06:11.943 sys 0m0.003s 00:06:11.943 18:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:11.943 18:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:11.943 ************************************ 00:06:11.943 END TEST scheduler_create_thread 00:06:11.943 ************************************ 00:06:12.202 18:42:56 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:12.202 18:42:56 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2012471 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2012471 ']' 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2012471 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2012471 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2012471' 00:06:12.202 killing process with pid 2012471 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2012471 00:06:12.202 18:42:56 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2012471 00:06:12.461 [2024-07-24 18:42:57.404314] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
00:06:12.719 00:06:12.719 real 0m4.344s 00:06:12.719 user 0m8.194s 00:06:12.719 sys 0m0.334s 00:06:12.719 18:42:57 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.719 18:42:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:12.719 ************************************ 00:06:12.719 END TEST event_scheduler 00:06:12.719 ************************************ 00:06:12.719 18:42:57 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:12.719 18:42:57 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:12.719 18:42:57 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:12.719 18:42:57 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.719 18:42:57 event -- common/autotest_common.sh@10 -- # set +x 00:06:12.719 ************************************ 00:06:12.719 START TEST app_repeat 00:06:12.719 ************************************ 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2013297 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2013297' 00:06:12.719 Process app_repeat pid: 2013297 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:12.719 spdk_app_start Round 0 00:06:12.719 18:42:57 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2013297 /var/tmp/spdk-nbd.sock 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2013297 ']' 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:12.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.719 18:42:57 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:12.719 [2024-07-24 18:42:57.711899] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:06:12.719 [2024-07-24 18:42:57.711949] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2013297 ] 00:06:12.977 [2024-07-24 18:42:57.778414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:12.977 [2024-07-24 18:42:57.849501] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:12.977 [2024-07-24 18:42:57.849503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.544 18:42:58 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.544 18:42:58 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:13.544 18:42:58 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:13.802 Malloc0 00:06:13.803 18:42:58 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:14.062 Malloc1 00:06:14.062 18:42:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:14.062 18:42:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:14.062 /dev/nbd0 00:06:14.062 18:42:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:14.062 18:42:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:14.321 1+0 records in 00:06:14.321 1+0 records out 00:06:14.321 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000173943 s, 23.5 MB/s 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:14.321 /dev/nbd1 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:14.321 1+0 records in 00:06:14.321 1+0 records out 00:06:14.321 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238688 s, 17.2 MB/s 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:14.321 18:42:59 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:14.321 18:42:59 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.321 18:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:14.620 { 00:06:14.620 "nbd_device": "/dev/nbd0", 00:06:14.620 "bdev_name": "Malloc0" 00:06:14.620 }, 00:06:14.620 { 00:06:14.620 "nbd_device": "/dev/nbd1", 00:06:14.620 "bdev_name": "Malloc1" 00:06:14.620 } 00:06:14.620 ]' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:14.620 { 00:06:14.620 "nbd_device": "/dev/nbd0", 00:06:14.620 "bdev_name": "Malloc0" 00:06:14.620 }, 00:06:14.620 { 00:06:14.620 "nbd_device": "/dev/nbd1", 00:06:14.620 "bdev_name": "Malloc1" 00:06:14.620 } 00:06:14.620 ]' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:14.620 /dev/nbd1' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:14.620 /dev/nbd1' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:14.620 256+0 records in 00:06:14.620 256+0 records out 00:06:14.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103245 s, 102 MB/s 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:14.620 256+0 records in 00:06:14.620 256+0 records out 00:06:14.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0129698 s, 80.8 MB/s 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:14.620 256+0 records in 00:06:14.620 256+0 records out 00:06:14.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0145534 s, 72.1 MB/s 00:06:14.620 18:42:59 event.app_repeat -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:14.620 18:42:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:14.913 18:42:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.172 18:42:59 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.172 18:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:15.172 18:43:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:15.172 18:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:15.172 18:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:15.431 18:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:15.432 18:43:00 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:15.432 18:43:00 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:15.432 18:43:00 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:15.691 [2024-07-24 18:43:00.582824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:15.691 [2024-07-24 18:43:00.648286] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.691 [2024-07-24 18:43:00.648287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.691 [2024-07-24 18:43:00.688003] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:15.691 [2024-07-24 18:43:00.688045] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:18.979 spdk_app_start Round 1 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2013297 /var/tmp/spdk-nbd.sock 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2013297 ']' 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:18.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.979 18:43:03 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:18.979 Malloc0 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:18.979 Malloc1 00:06:18.979 18:43:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:18.979 18:43:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:19.238 /dev/nbd0 00:06:19.238 18:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:19.238 18:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:06:19.238 1+0 records in 00:06:19.238 1+0 records out 00:06:19.238 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233594 s, 17.5 MB/s 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:19.238 18:43:04 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:19.238 18:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:19.238 18:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:19.239 18:43:04 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:19.498 /dev/nbd1 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:19.498 1+0 records in 00:06:19.498 1+0 records out 00:06:19.498 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196291 s, 20.9 MB/s 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:19.498 18:43:04 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:06:19.498 { 00:06:19.498 "nbd_device": "/dev/nbd0", 00:06:19.498 "bdev_name": "Malloc0" 00:06:19.498 }, 00:06:19.498 { 00:06:19.498 "nbd_device": "/dev/nbd1", 00:06:19.498 "bdev_name": "Malloc1" 00:06:19.498 } 00:06:19.498 ]' 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:19.498 18:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:19.498 { 00:06:19.498 "nbd_device": "/dev/nbd0", 00:06:19.498 "bdev_name": "Malloc0" 00:06:19.498 }, 00:06:19.498 { 00:06:19.498 "nbd_device": "/dev/nbd1", 00:06:19.498 "bdev_name": "Malloc1" 00:06:19.498 } 00:06:19.498 ]' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:19.757 /dev/nbd1' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:19.757 /dev/nbd1' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:19.757 256+0 records in 00:06:19.757 256+0 records out 00:06:19.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102871 s, 102 MB/s 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:19.757 256+0 records in 00:06:19.757 256+0 records out 00:06:19.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.013619 s, 77.0 MB/s 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:19.757 256+0 records in 00:06:19.757 256+0 records out 00:06:19.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0146404 s, 71.6 MB/s 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:19.757 18:43:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:06:20.020 18:43:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:20.278 18:43:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:20.278 18:43:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:20.536 18:43:05 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:20.795 [2024-07-24 18:43:05.575617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:20.795 [2024-07-24 18:43:05.653716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:20.795 [2024-07-24 18:43:05.653718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.795 [2024-07-24 18:43:05.695282] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:20.795 [2024-07-24 18:43:05.695323] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:24.080 18:43:08 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:24.080 18:43:08 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:24.081 spdk_app_start Round 2 00:06:24.081 18:43:08 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2013297 /var/tmp/spdk-nbd.sock 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2013297 ']' 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:24.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.081 18:43:08 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:24.081 18:43:08 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:24.081 Malloc0 00:06:24.081 18:43:08 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:24.081 Malloc1 00:06:24.081 18:43:08 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:24.081 18:43:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:24.081 /dev/nbd0 00:06:24.081 18:43:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:24.081 18:43:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:06:24.081 1+0 records in 00:06:24.081 1+0 records out 00:06:24.081 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000176835 s, 23.2 MB/s 00:06:24.081 18:43:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:24.340 /dev/nbd1 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:24.340 1+0 records in 00:06:24.340 1+0 records out 00:06:24.340 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218903 s, 18.7 MB/s 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:24.340 18:43:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:24.340 18:43:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:06:24.599 { 00:06:24.599 "nbd_device": "/dev/nbd0", 00:06:24.599 "bdev_name": "Malloc0" 00:06:24.599 }, 00:06:24.599 { 00:06:24.599 "nbd_device": "/dev/nbd1", 00:06:24.599 "bdev_name": "Malloc1" 00:06:24.599 } 00:06:24.599 ]' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:24.599 { 00:06:24.599 "nbd_device": "/dev/nbd0", 00:06:24.599 "bdev_name": "Malloc0" 00:06:24.599 }, 00:06:24.599 { 00:06:24.599 "nbd_device": "/dev/nbd1", 00:06:24.599 "bdev_name": "Malloc1" 00:06:24.599 } 00:06:24.599 ]' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:24.599 /dev/nbd1' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:24.599 /dev/nbd1' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:24.599 256+0 records in 00:06:24.599 256+0 records out 00:06:24.599 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104786 s, 100 MB/s 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:24.599 256+0 records in 00:06:24.599 256+0 records out 00:06:24.599 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0132528 s, 79.1 MB/s 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:24.599 256+0 records in 00:06:24.599 256+0 records out 00:06:24.599 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0142811 s, 73.4 MB/s 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:24.599 18:43:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:24.859 18:43:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.118 18:43:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:25.377 18:43:10 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:25.377 18:43:10 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:25.636 18:43:10 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:25.636 [2024-07-24 18:43:10.575991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:25.636 [2024-07-24 18:43:10.641584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:25.636 [2024-07-24 18:43:10.641586] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.896 [2024-07-24 18:43:10.682182] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:25.896 [2024-07-24 18:43:10.682224] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:28.432 18:43:13 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2013297 /var/tmp/spdk-nbd.sock 00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2013297 ']' 00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:28.432 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
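The round traced above exercises the nbd_rpc_data_verify flow from bdev/nbd_common.sh: two malloc bdevs are exported as /dev/nbd0 and /dev/nbd1 over /var/tmp/spdk-nbd.sock, a 1 MiB random file is written onto each device with direct I/O, each device is compared back against the source file, and the disks are stopped again. Condensed into a stand-alone sketch (the temp-file path is illustrative; RPC names and dd/cmp arguments match the trace):

# Sketch of the nbd_rpc_data_verify flow traced above. Assumes an SPDK app is
# already serving RPCs on $SOCK and that the nbd kernel module is loaded;
# /tmp/nbdrandtest is an illustrative temp path, not the one used in this run.
SOCK=/var/tmp/spdk-nbd.sock
RPC=./scripts/rpc.py
$RPC -s "$SOCK" bdev_malloc_create 64 4096          # -> Malloc0
$RPC -s "$SOCK" bdev_malloc_create 64 4096          # -> Malloc1
$RPC -s "$SOCK" nbd_start_disk Malloc0 /dev/nbd0
$RPC -s "$SOCK" nbd_start_disk Malloc1 /dev/nbd1
dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256
for d in /dev/nbd0 /dev/nbd1; do
  dd if=/tmp/nbdrandtest of="$d" bs=4096 count=256 oflag=direct
  cmp -b -n 1M /tmp/nbdrandtest "$d"                # byte-for-byte verify
done
$RPC -s "$SOCK" nbd_stop_disk /dev/nbd0
$RPC -s "$SOCK" nbd_stop_disk /dev/nbd1
rm -f /tmp/nbdrandtest
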
00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.432 18:43:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:28.691 18:43:13 event.app_repeat -- event/event.sh@39 -- # killprocess 2013297 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2013297 ']' 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2013297 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2013297 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2013297' 00:06:28.691 killing process with pid 2013297 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2013297 00:06:28.691 18:43:13 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2013297 00:06:28.950 spdk_app_start is called in Round 0. 00:06:28.950 Shutdown signal received, stop current app iteration 00:06:28.950 Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 reinitialization... 00:06:28.950 spdk_app_start is called in Round 1. 00:06:28.950 Shutdown signal received, stop current app iteration 00:06:28.950 Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 reinitialization... 00:06:28.950 spdk_app_start is called in Round 2. 00:06:28.950 Shutdown signal received, stop current app iteration 00:06:28.950 Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 reinitialization... 00:06:28.950 spdk_app_start is called in Round 3. 
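The killprocess 2013297 call traced just above follows a fixed pattern: confirm the PID argument is set, check the process is still alive, read its command name with ps so a bare sudo is never signalled directly, then send the default SIGTERM and wait for the PID to exit. A reduced sketch of that pattern (Linux path only; the real helper in autotest_common.sh also branches on uname and handles the sudo-wrapped case, omitted here):

# Reduced killprocess sketch; mirrors the checks traced above.
killprocess() {
  local pid=$1
  [ -n "$pid" ] || return 1                  # the '[' -z ... ']' check
  kill -0 "$pid" 2>/dev/null || return 0     # nothing to do if already gone
  local name
  name=$(ps --no-headers -o comm= "$pid")
  [ "$name" = sudo ] && return 1             # never SIGTERM a bare sudo
  echo "killing process with pid $pid"
  kill "$pid"                                # SIGTERM by default
  wait "$pid" 2>/dev/null || true            # wait only works for child processes
}
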
00:06:28.950 Shutdown signal received, stop current app iteration 00:06:28.950 18:43:13 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:28.950 18:43:13 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:28.950 00:06:28.950 real 0m16.092s 00:06:28.950 user 0m34.708s 00:06:28.950 sys 0m2.351s 00:06:28.950 18:43:13 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:28.950 18:43:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:28.950 ************************************ 00:06:28.950 END TEST app_repeat 00:06:28.950 ************************************ 00:06:28.950 18:43:13 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:28.950 00:06:28.950 real 0m24.520s 00:06:28.950 user 0m49.492s 00:06:28.950 sys 0m3.198s 00:06:28.950 18:43:13 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:28.950 18:43:13 event -- common/autotest_common.sh@10 -- # set +x 00:06:28.950 ************************************ 00:06:28.950 END TEST event 00:06:28.950 ************************************ 00:06:28.950 18:43:13 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:28.950 18:43:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:28.950 18:43:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.950 18:43:13 -- common/autotest_common.sh@10 -- # set +x 00:06:28.950 ************************************ 00:06:28.950 START TEST thread 00:06:28.950 ************************************ 00:06:28.950 18:43:13 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:28.950 * Looking for test storage... 00:06:28.950 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:28.950 18:43:13 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:28.950 18:43:13 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:28.950 18:43:13 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.950 18:43:13 thread -- common/autotest_common.sh@10 -- # set +x 00:06:29.209 ************************************ 00:06:29.209 START TEST thread_poller_perf 00:06:29.209 ************************************ 00:06:29.209 18:43:13 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:29.209 [2024-07-24 18:43:14.001797] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:29.209 [2024-07-24 18:43:14.001856] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2016260 ] 00:06:29.209 [2024-07-24 18:43:14.069099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.209 [2024-07-24 18:43:14.140374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.209 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:30.587 ====================================== 00:06:30.587 busy:2105237694 (cyc) 00:06:30.587 total_run_count: 425000 00:06:30.587 tsc_hz: 2100000000 (cyc) 00:06:30.587 ====================================== 00:06:30.587 poller_cost: 4953 (cyc), 2358 (nsec) 00:06:30.587 00:06:30.587 real 0m1.237s 00:06:30.587 user 0m1.148s 00:06:30.587 sys 0m0.084s 00:06:30.587 18:43:15 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:30.587 18:43:15 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:30.587 ************************************ 00:06:30.587 END TEST thread_poller_perf 00:06:30.587 ************************************ 00:06:30.587 18:43:15 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:30.587 18:43:15 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:30.587 18:43:15 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.587 18:43:15 thread -- common/autotest_common.sh@10 -- # set +x 00:06:30.587 ************************************ 00:06:30.587 START TEST thread_poller_perf 00:06:30.587 ************************************ 00:06:30.587 18:43:15 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:30.587 [2024-07-24 18:43:15.306938] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:30.587 [2024-07-24 18:43:15.306999] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2016512 ] 00:06:30.587 [2024-07-24 18:43:15.374149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.587 [2024-07-24 18:43:15.442636] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.587 Running 1000 pollers for 1 seconds with 0 microseconds period. 
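The summary block above contains everything needed to reproduce poller_cost: it is the busy TSC cycle count divided by total_run_count, converted to nanoseconds using tsc_hz (2.1 GHz on this node). For this run, 2105237694 / 425000 ≈ 4953 cycles ≈ 2358 ns, matching the printed values. The same integer arithmetic in shell:

# Recompute poller_cost from the figures in the summary above.
busy=2105237694; runs=425000; tsc_hz=2100000000
cyc=$(( busy / runs ))                       # 4953 cycles per poller invocation
nsec=$(( cyc * 1000000000 / tsc_hz ))        # 2358 ns at 2.1 GHz
echo "poller_cost: ${cyc} (cyc), ${nsec} (nsec)"
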
00:06:31.523 ====================================== 00:06:31.524 busy:2101262316 (cyc) 00:06:31.524 total_run_count: 5559000 00:06:31.524 tsc_hz: 2100000000 (cyc) 00:06:31.524 ====================================== 00:06:31.524 poller_cost: 377 (cyc), 179 (nsec) 00:06:31.524 00:06:31.524 real 0m1.232s 00:06:31.524 user 0m1.143s 00:06:31.524 sys 0m0.085s 00:06:31.524 18:43:16 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:31.524 18:43:16 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:31.524 ************************************ 00:06:31.524 END TEST thread_poller_perf 00:06:31.524 ************************************ 00:06:31.783 18:43:16 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:31.783 00:06:31.783 real 0m2.687s 00:06:31.783 user 0m2.377s 00:06:31.783 sys 0m0.316s 00:06:31.783 18:43:16 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:31.783 18:43:16 thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.783 ************************************ 00:06:31.783 END TEST thread 00:06:31.783 ************************************ 00:06:31.783 18:43:16 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:31.783 18:43:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:31.783 18:43:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.783 18:43:16 -- common/autotest_common.sh@10 -- # set +x 00:06:31.783 ************************************ 00:06:31.783 START TEST accel 00:06:31.783 ************************************ 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:31.783 * Looking for test storage... 00:06:31.783 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:31.783 18:43:16 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:31.783 18:43:16 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:31.783 18:43:16 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:31.783 18:43:16 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2016801 00:06:31.783 18:43:16 accel -- accel/accel.sh@63 -- # waitforlisten 2016801 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@829 -- # '[' -z 2016801 ']' 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.783 18:43:16 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:31.783 18:43:16 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.783 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:31.783 18:43:16 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.783 18:43:16 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:31.783 18:43:16 accel -- common/autotest_common.sh@10 -- # set +x 00:06:31.783 18:43:16 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.783 18:43:16 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.783 18:43:16 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:31.783 18:43:16 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:31.783 18:43:16 accel -- accel/accel.sh@41 -- # jq -r . 00:06:31.783 [2024-07-24 18:43:16.748064] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:31.783 [2024-07-24 18:43:16.748109] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2016801 ] 00:06:32.041 [2024-07-24 18:43:16.812579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.041 [2024-07-24 18:43:16.884180] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.609 18:43:17 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:32.609 18:43:17 accel -- common/autotest_common.sh@862 -- # return 0 00:06:32.609 18:43:17 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:32.609 18:43:17 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:32.609 18:43:17 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:32.609 18:43:17 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:32.609 18:43:17 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:32.609 18:43:17 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:32.609 18:43:17 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:32.609 18:43:17 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:32.609 18:43:17 accel -- common/autotest_common.sh@10 -- # set +x 00:06:32.609 18:43:17 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:32.609 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.609 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.609 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.609 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.609 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 
18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # IFS== 00:06:32.610 18:43:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:32.610 18:43:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:32.610 18:43:17 accel -- accel/accel.sh@75 -- # killprocess 2016801 00:06:32.610 18:43:17 accel -- common/autotest_common.sh@948 -- # '[' -z 2016801 ']' 00:06:32.610 18:43:17 accel -- common/autotest_common.sh@952 -- # kill -0 2016801 00:06:32.610 18:43:17 accel -- common/autotest_common.sh@953 -- # uname 00:06:32.610 18:43:17 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:32.610 18:43:17 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2016801 00:06:32.869 18:43:17 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:32.869 18:43:17 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:32.869 18:43:17 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2016801' 00:06:32.869 killing process with pid 2016801 00:06:32.869 18:43:17 accel -- common/autotest_common.sh@967 -- # kill 2016801 00:06:32.869 18:43:17 accel -- common/autotest_common.sh@972 -- # wait 2016801 00:06:33.128 18:43:17 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:33.128 18:43:17 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:33.128 18:43:17 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:06:33.128 18:43:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.128 18:43:17 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.128 18:43:17 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:33.128 18:43:17 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:06:33.128 18:43:18 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:33.128 18:43:18 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:33.128 18:43:18 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:33.128 18:43:18 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:33.128 18:43:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.128 18:43:18 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.128 ************************************ 00:06:33.128 START TEST accel_missing_filename 00:06:33.128 ************************************ 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.128 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:06:33.128 18:43:18 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:33.128 18:43:18 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:33.128 18:43:18 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:33.129 18:43:18 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:33.129 [2024-07-24 18:43:18.090307] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:33.129 [2024-07-24 18:43:18.090352] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017065 ] 00:06:33.387 [2024-07-24 18:43:18.154969] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.388 [2024-07-24 18:43:18.225118] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.388 [2024-07-24 18:43:18.278529] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:33.388 [2024-07-24 18:43:18.338719] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:06:33.647 A filename is required. 
00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:33.647 00:06:33.647 real 0m0.352s 00:06:33.647 user 0m0.261s 00:06:33.647 sys 0m0.119s 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:33.647 18:43:18 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:33.647 ************************************ 00:06:33.647 END TEST accel_missing_filename 00:06:33.647 ************************************ 00:06:33.647 18:43:18 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:33.647 18:43:18 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:33.647 18:43:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.647 18:43:18 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.647 ************************************ 00:06:33.647 START TEST accel_compress_verify 00:06:33.647 ************************************ 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.647 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.647 18:43:18 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.648 18:43:18 
accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:33.648 18:43:18 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:33.648 [2024-07-24 18:43:18.499168] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:33.648 [2024-07-24 18:43:18.499209] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017092 ] 00:06:33.648 [2024-07-24 18:43:18.562979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.648 [2024-07-24 18:43:18.634238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.907 [2024-07-24 18:43:18.688551] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:33.907 [2024-07-24 18:43:18.749145] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:06:33.907 00:06:33.907 Compression does not support the verify option, aborting. 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:33.907 00:06:33.907 real 0m0.344s 00:06:33.907 user 0m0.248s 00:06:33.907 sys 0m0.118s 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:33.907 18:43:18 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:33.907 ************************************ 00:06:33.907 END TEST accel_compress_verify 00:06:33.907 ************************************ 00:06:33.907 18:43:18 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:33.907 18:43:18 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:33.907 18:43:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.907 18:43:18 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.907 ************************************ 00:06:33.907 START TEST accel_wrong_workload 00:06:33.907 ************************************ 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:33.907 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:06:33.907 18:43:18 accel.accel_wrong_workload -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:06:33.907 18:43:18 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:06:34.167 Unsupported workload type: foobar 00:06:34.167 [2024-07-24 18:43:18.917534] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:34.167 accel_perf options: 00:06:34.167 [-h help message] 00:06:34.167 [-q queue depth per core] 00:06:34.167 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:34.167 [-T number of threads per core 00:06:34.167 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:34.167 [-t time in seconds] 00:06:34.167 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:34.167 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:34.167 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:34.167 [-l for compress/decompress workloads, name of uncompressed input file 00:06:34.167 [-S for crc32c workload, use this seed value (default 0) 00:06:34.167 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:34.167 [-f for fill workload, use this BYTE value (default 255) 00:06:34.167 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:34.167 [-y verify result if this switch is on] 00:06:34.167 [-a tasks to allocate per core (default: same value as -q)] 00:06:34.167 Can be used to spread operations across a wider range of memory. 
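The option listing above is accel_perf's own help text, and the workloads exercised in this job map directly onto it. Invocations equivalent to the ones traced here would look like the following (binary path as built inside this workspace; adjust to the local tree):

# Examples mirroring the accel_perf flags used by accel.sh in this run.
PERF=./build/examples/accel_perf
$PERF -t 1 -w crc32c -S 32 -y              # 1 s of crc32c, seed 32, verify results
$PERF -t 1 -w crc32c -y -C 2               # crc32c over a 2-element io vector
$PERF -t 1 -w compress -l ./test/accel/bib # compress needs -l (else "A filename is required.")
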
00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:34.167 00:06:34.167 real 0m0.039s 00:06:34.167 user 0m0.023s 00:06:34.167 sys 0m0.016s 00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.167 18:43:18 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:34.167 ************************************ 00:06:34.167 END TEST accel_wrong_workload 00:06:34.167 ************************************ 00:06:34.167 Error: writing output failed: Broken pipe 00:06:34.167 18:43:18 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:34.167 18:43:18 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:34.167 18:43:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.167 18:43:18 accel -- common/autotest_common.sh@10 -- # set +x 00:06:34.167 ************************************ 00:06:34.167 START TEST accel_negative_buffers 00:06:34.167 ************************************ 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.167 18:43:18 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:34.167 18:43:18 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:34.167 -x option must be non-negative. 
00:06:34.167 [2024-07-24 18:43:19.015117] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:34.167 accel_perf options: 00:06:34.167 [-h help message] 00:06:34.167 [-q queue depth per core] 00:06:34.167 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:34.167 [-T number of threads per core 00:06:34.167 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:34.167 [-t time in seconds] 00:06:34.167 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:34.167 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:34.167 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:34.167 [-l for compress/decompress workloads, name of uncompressed input file 00:06:34.167 [-S for crc32c workload, use this seed value (default 0) 00:06:34.167 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:34.167 [-f for fill workload, use this BYTE value (default 255) 00:06:34.167 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:34.167 [-y verify result if this switch is on] 00:06:34.167 [-a tasks to allocate per core (default: same value as -q)] 00:06:34.167 Can be used to spread operations across a wider range of memory. 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:34.167 00:06:34.167 real 0m0.038s 00:06:34.167 user 0m0.024s 00:06:34.167 sys 0m0.014s 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.167 18:43:19 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:34.167 ************************************ 00:06:34.167 END TEST accel_negative_buffers 00:06:34.167 ************************************ 00:06:34.167 Error: writing output failed: Broken pipe 00:06:34.167 18:43:19 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:34.167 18:43:19 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:34.167 18:43:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.167 18:43:19 accel -- common/autotest_common.sh@10 -- # set +x 00:06:34.167 ************************************ 00:06:34.167 START TEST accel_crc32c 00:06:34.167 ************************************ 00:06:34.167 18:43:19 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
crc32c -S 32 -y 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:34.167 18:43:19 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:34.167 [2024-07-24 18:43:19.111418] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:34.167 [2024-07-24 18:43:19.111464] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017217 ] 00:06:34.427 [2024-07-24 18:43:19.180576] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.427 [2024-07-24 18:43:19.254170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var 
val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:34.427 18:43:19 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.818 18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 
18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:35.819 18:43:20 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.819 00:06:35.819 real 0m1.365s 00:06:35.819 user 0m1.250s 00:06:35.819 sys 0m0.121s 00:06:35.819 18:43:20 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.819 18:43:20 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:35.819 ************************************ 00:06:35.819 END TEST accel_crc32c 00:06:35.819 ************************************ 00:06:35.819 18:43:20 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:35.819 18:43:20 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:35.819 18:43:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.819 18:43:20 accel -- common/autotest_common.sh@10 -- # set +x 00:06:35.819 ************************************ 00:06:35.819 START TEST accel_crc32c_C2 00:06:35.819 ************************************ 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.819 
18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:35.819 [2024-07-24 18:43:20.538228] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:35.819 [2024-07-24 18:43:20.538274] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017525 ] 00:06:35.819 [2024-07-24 18:43:20.603101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.819 [2024-07-24 18:43:20.675308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 
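For context on what this block of trace is exercising: the command line shown above launches SPDK's accel_perf example binary with the flags -t 1 -w crc32c -y -C 2 and an accel JSON configuration handed over on /dev/fd/62. A minimal sketch of reproducing the same invocation by hand, assuming an already-built SPDK tree at the workspace path this job uses (the flags are copied verbatim from the trace; the harness-generated -c config is simply left out for a plain manual run):

    # Assumes a built SPDK workspace at the path used by this CI job.
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # Same workload and flags as the traced invocation above; flag meanings are
    # as documented by `accel_perf --help` and are not restated here.
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w crc32c -y -C 2

The other accel tests in this log follow the same pattern, only with a different -w workload (copy, fill, copy_crc32c, dualcast) and, for fill, the extra -f/-q/-a options visible in the traced run_test line.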
00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.819 18:43:20 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 
-- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.222 00:06:37.222 real 0m1.360s 00:06:37.222 user 0m1.243s 00:06:37.222 sys 0m0.122s 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.222 18:43:21 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:37.222 ************************************ 00:06:37.222 END TEST accel_crc32c_C2 00:06:37.222 ************************************ 00:06:37.222 18:43:21 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:37.222 18:43:21 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:37.222 18:43:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.222 18:43:21 accel -- common/autotest_common.sh@10 -- # set +x 00:06:37.222 ************************************ 00:06:37.222 START TEST accel_copy 00:06:37.222 ************************************ 00:06:37.222 18:43:21 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w copy -y 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:37.222 18:43:21 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:37.222 [2024-07-24 18:43:21.954338] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:37.222 [2024-07-24 18:43:21.954381] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017804 ] 00:06:37.222 [2024-07-24 18:43:22.018584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.222 [2024-07-24 18:43:22.090025] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.222 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:37.223 18:43:22 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:37.223 18:43:22 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:37.223 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:37.223 18:43:22 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 
00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:38.600 18:43:23 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.600 00:06:38.600 real 0m1.355s 00:06:38.600 user 0m1.236s 00:06:38.600 sys 0m0.118s 00:06:38.600 18:43:23 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.600 18:43:23 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:38.600 ************************************ 00:06:38.600 END TEST accel_copy 00:06:38.600 ************************************ 00:06:38.600 18:43:23 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:38.600 18:43:23 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:06:38.600 18:43:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.600 18:43:23 accel -- common/autotest_common.sh@10 -- # set +x 00:06:38.600 ************************************ 00:06:38.600 START TEST accel_fill 00:06:38.600 ************************************ 00:06:38.600 18:43:23 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:38.600 18:43:23 accel.accel_fill 
-- accel/accel.sh@41 -- # jq -r . 00:06:38.600 [2024-07-24 18:43:23.375866] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:38.600 [2024-07-24 18:43:23.375901] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2018091 ] 00:06:38.600 [2024-07-24 18:43:23.438165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.600 [2024-07-24 18:43:23.508614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # 
val=software 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:38.600 18:43:23 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 
accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:39.978 18:43:24 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.978 00:06:39.978 real 0m1.346s 00:06:39.978 user 0m1.234s 00:06:39.978 sys 0m0.118s 00:06:39.978 18:43:24 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:39.978 18:43:24 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:06:39.978 ************************************ 00:06:39.978 END TEST accel_fill 00:06:39.978 ************************************ 00:06:39.978 18:43:24 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:39.978 18:43:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:39.978 18:43:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.978 18:43:24 accel -- common/autotest_common.sh@10 -- # set +x 00:06:39.978 ************************************ 00:06:39.978 START TEST accel_copy_crc32c 00:06:39.978 ************************************ 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:39.978 [2024-07-24 18:43:24.794887] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
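Each test above finishes with checks such as [[ -n software ]] and [[ software == \s\o\f\t\w\a\r\e ]]. The backslash-escaped form is simply how bash xtrace prints a quoted right-hand side of == inside [[ ]]: quoting the word forces a literal string comparison instead of a glob match, and the trace escapes every character to make that explicit. A small sketch of the same effect (the variable name is illustrative, not the literal accel.sh source):

    set -x
    accel_module=software                  # illustrative stand-in for the harness's value
    [[ -n "$accel_module" ]]               # appears in the trace as: [[ -n software ]]
    [[ "$accel_module" == "software" ]]    # appears in the trace as: [[ software == \s\o\f\t\w\a\r\e ]]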
00:06:39.978 [2024-07-24 18:43:24.794934] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2018356 ] 00:06:39.978 [2024-07-24 18:43:24.859790] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.978 [2024-07-24 18:43:24.931291] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.978 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.237 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:24 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.238 18:43:25 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.238 18:43:25 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.238 18:43:25 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.238 18:43:25 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 
-- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.175 00:06:41.175 real 0m1.361s 00:06:41.175 user 0m1.240s 00:06:41.175 sys 0m0.126s 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:41.175 18:43:26 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:41.175 ************************************ 00:06:41.175 END TEST accel_copy_crc32c 00:06:41.175 ************************************ 00:06:41.175 18:43:26 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:41.175 18:43:26 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:41.175 18:43:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.175 18:43:26 accel -- common/autotest_common.sh@10 -- # set +x 00:06:41.435 ************************************ 00:06:41.435 START TEST accel_copy_crc32c_C2 00:06:41.435 ************************************ 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:41.435 [2024-07-24 18:43:26.220699] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:41.435 [2024-07-24 18:43:26.220748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2018598 ] 00:06:41.435 [2024-07-24 18:43:26.284465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.435 [2024-07-24 18:43:26.355274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 
00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.435 18:43:26 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.812 00:06:42.812 real 0m1.353s 00:06:42.812 user 0m1.237s 00:06:42.812 sys 0m0.123s 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:42.812 18:43:27 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:42.812 ************************************ 00:06:42.812 END TEST accel_copy_crc32c_C2 00:06:42.812 ************************************ 00:06:42.812 18:43:27 accel -- accel/accel.sh@107 -- # run_test 
accel_dualcast accel_test -t 1 -w dualcast -y 00:06:42.812 18:43:27 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:42.813 18:43:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.813 18:43:27 accel -- common/autotest_common.sh@10 -- # set +x 00:06:42.813 ************************************ 00:06:42.813 START TEST accel_dualcast 00:06:42.813 ************************************ 00:06:42.813 18:43:27 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:06:42.813 [2024-07-24 18:43:27.636947] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
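Throughout these runs the perf binary is started with -c /dev/fd/62, meaning the accel JSON configuration assembled by build_accel_config (the accel_json_cfg=() and jq -r . lines in the trace) is delivered on a file descriptor rather than as a file on disk; in the traced runs it arrives on fd 62. A rough illustration of that shell pattern, not the literal accel.sh code:

    # Illustrative only: pass a generated JSON document to a program through a
    # /dev/fd path using process substitution (the shell picks the fd number here;
    # the harness's own plumbing is what ends up on fd 62 in the log above).
    config='{"subsystems": []}'             # hypothetical placeholder config
    some_tool -c <(printf '%s\n' "$config") # some_tool is a hypothetical consumer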
00:06:42.813 [2024-07-24 18:43:27.636994] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2018851 ] 00:06:42.813 [2024-07-24 18:43:27.700999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.813 [2024-07-24 18:43:27.771510] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.813 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.072 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.073 18:43:27 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.073 18:43:27 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.073 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.073 18:43:27 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.009 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:44.010 18:43:28 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.010 00:06:44.010 real 0m1.352s 00:06:44.010 user 0m1.241s 00:06:44.010 sys 0m0.117s 00:06:44.010 18:43:28 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:44.010 18:43:28 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:06:44.010 ************************************ 00:06:44.010 END TEST accel_dualcast 00:06:44.010 ************************************ 00:06:44.010 18:43:28 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:44.010 18:43:28 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:44.010 18:43:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.010 18:43:28 accel -- common/autotest_common.sh@10 -- # set +x 00:06:44.269 ************************************ 00:06:44.269 START TEST accel_compare 00:06:44.269 ************************************ 00:06:44.269 18:43:29 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:06:44.269 [2024-07-24 18:43:29.052994] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:06:44.269 [2024-07-24 18:43:29.053044] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2019097 ] 00:06:44.269 [2024-07-24 18:43:29.117136] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.269 [2024-07-24 18:43:29.186916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
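Each pass above closes the same way: three [[ -n ... ]] checks confirm that a module and an opcode were reported and that the software module did the work, then real/user/sys timings are printed ahead of the END TEST banner. A hypothetical helper (not part of the harness; the log file name is assumed) for pulling those timings out of a saved copy of this output:

  # print every per-pass wall-clock figure, e.g. "real 0m1.352s"
  grep -Eo 'real[[:space:]]+[0-9]+m[0-9.]+s' crypto-phy-autotest.log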
00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:44.269 18:43:29 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:45.648 18:43:30 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.648 00:06:45.648 real 0m1.350s 00:06:45.648 user 0m1.230s 00:06:45.648 sys 0m0.125s 00:06:45.648 18:43:30 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.648 18:43:30 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:06:45.648 ************************************ 00:06:45.648 END TEST accel_compare 00:06:45.648 ************************************ 00:06:45.648 18:43:30 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:45.648 18:43:30 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:45.648 18:43:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.648 18:43:30 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.648 ************************************ 00:06:45.648 START TEST accel_xor 00:06:45.648 ************************************ 00:06:45.648 18:43:30 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:45.648 [2024-07-24 18:43:30.465962] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:06:45.648 [2024-07-24 18:43:30.466007] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2019347 ] 00:06:45.648 [2024-07-24 18:43:30.529968] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.648 [2024-07-24 18:43:30.601326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.648 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.907 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:45.908 18:43:30 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.908 18:43:30 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.844 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:46.845 18:43:31 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.845 00:06:46.845 real 0m1.356s 00:06:46.845 user 0m1.244s 00:06:46.845 sys 0m0.116s 00:06:46.845 18:43:31 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:46.845 18:43:31 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:46.845 ************************************ 00:06:46.845 END TEST accel_xor 00:06:46.845 ************************************ 00:06:46.845 18:43:31 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:46.845 18:43:31 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:46.845 18:43:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.845 18:43:31 accel -- common/autotest_common.sh@10 -- # set +x 00:06:47.104 ************************************ 00:06:47.104 START TEST accel_xor 00:06:47.104 ************************************ 00:06:47.104 18:43:31 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:47.104 18:43:31 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:47.104 [2024-07-24 18:43:31.890407] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
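The first xor pass ran with the default two source buffers (val=2 in the trace); the pass starting here adds -x 3, which raises the source-buffer count to three (val=3 below). Equivalent standalone invocations, under the same assumptions as the dualcast sketch earlier:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y        # two xor sources (default)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y -x 3   # three xor sources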
00:06:47.104 [2024-07-24 18:43:31.890455] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2019593 ] 00:06:47.104 [2024-07-24 18:43:31.956001] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.104 [2024-07-24 18:43:32.026429] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.104 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.104 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.104 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.104 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.104 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:47.105 18:43:32 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.105 18:43:32 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:48.482 18:43:33 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.482 00:06:48.482 real 0m1.355s 00:06:48.482 user 0m1.234s 00:06:48.482 sys 0m0.128s 00:06:48.482 18:43:33 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:48.482 18:43:33 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:48.482 ************************************ 00:06:48.482 END TEST accel_xor 00:06:48.482 ************************************ 00:06:48.482 18:43:33 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:48.482 18:43:33 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:48.482 18:43:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.482 18:43:33 accel -- common/autotest_common.sh@10 -- # set +x 00:06:48.482 ************************************ 00:06:48.482 START TEST accel_dif_verify 00:06:48.482 ************************************ 00:06:48.482 18:43:33 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:48.482 18:43:33 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:06:48.482 [2024-07-24 18:43:33.308820] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
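The DIF cases that start here differ from the copy/compare passes only in buffer setup: besides the usual '4096 bytes' transfer size the trace programs a second '4096 bytes' value plus '512 bytes' and '8 bytes' ones, presumably the protected block granularity and the per-block DIF field (an assumption; the trace does not label them). The invocation itself keeps the same one-second accel_perf pattern:

  # dif_verify shown; dif_generate and dif_generate_copy follow in the same form
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w dif_verify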
00:06:48.482 [2024-07-24 18:43:33.308871] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2019837 ] 00:06:48.482 [2024-07-24 18:43:33.372509] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.482 [2024-07-24 18:43:33.442947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.740 18:43:33 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:06:49.677 18:43:34 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.677 00:06:49.677 real 0m1.355s 00:06:49.677 user 0m1.241s 00:06:49.677 sys 0m0.122s 00:06:49.677 18:43:34 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.677 18:43:34 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:06:49.677 ************************************ 00:06:49.677 END TEST accel_dif_verify 00:06:49.677 ************************************ 00:06:49.677 18:43:34 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:49.677 18:43:34 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:49.677 18:43:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.677 18:43:34 accel -- common/autotest_common.sh@10 -- # set +x 00:06:49.936 ************************************ 00:06:49.936 START TEST accel_dif_generate 00:06:49.936 ************************************ 00:06:49.936 18:43:34 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:06:49.936 [2024-07-24 18:43:34.730354] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:49.936 [2024-07-24 18:43:34.730401] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2020085 ] 00:06:49.936 [2024-07-24 18:43:34.795260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.936 [2024-07-24 18:43:34.866257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:06:49.936 18:43:34 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # 
IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.936 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.937 18:43:34 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:06:51.313 18:43:36 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.313 00:06:51.313 real 0m1.359s 00:06:51.313 user 0m1.242s 00:06:51.313 sys 0m0.122s 00:06:51.313 18:43:36 accel.accel_dif_generate -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.313 18:43:36 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:06:51.313 ************************************ 00:06:51.313 END TEST accel_dif_generate 00:06:51.313 ************************************ 00:06:51.313 18:43:36 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:51.313 18:43:36 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:51.313 18:43:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.314 18:43:36 accel -- common/autotest_common.sh@10 -- # set +x 00:06:51.314 ************************************ 00:06:51.314 START TEST accel_dif_generate_copy 00:06:51.314 ************************************ 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:51.314 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:06:51.314 [2024-07-24 18:43:36.155423] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
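For context: the xtrace above shows accel.sh assembling its (empty) JSON accel config and launching the accel_perf example for the dif_generate_copy opcode. A roughly equivalent standalone invocation, assuming the same SPDK build tree (the relative path below stands in for the absolute workspace path in the captured command), would be:

    # sketch only: software accel module, 1-second dif_generate_copy run
    ./build/examples/accel_perf -t 1 -w dif_generate_copy

Here -t 1 matches the traced val='1 seconds' duration and -w selects the opcode recorded as accel_opc=dif_generate_copy; the -c /dev/fd/62 argument in the captured command feeds the JSON config the wrapper pipes in and, since accel_json_cfg is empty in this run, is presumably not needed for a plain software-module run.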
00:06:51.314 [2024-07-24 18:43:36.155466] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2020332 ] 00:06:51.314 [2024-07-24 18:43:36.218890] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.314 [2024-07-24 18:43:36.289907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.572 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:51.573 18:43:36 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.509 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.510 00:06:52.510 real 0m1.363s 00:06:52.510 user 0m1.242s 00:06:52.510 sys 0m0.119s 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.510 18:43:37 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:06:52.510 ************************************ 00:06:52.510 END TEST accel_dif_generate_copy 00:06:52.510 ************************************ 00:06:52.768 18:43:37 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:06:52.768 18:43:37 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.768 18:43:37 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:52.768 18:43:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.768 18:43:37 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.768 ************************************ 00:06:52.768 START TEST accel_comp 00:06:52.768 ************************************ 00:06:52.768 18:43:37 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 
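The accel_comp test that starts here exercises the compress opcode, which unlike the DIF workloads needs an input file: the run_test line above passes -w compress together with -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib. A minimal sketch of the underlying call, again assuming the same tree layout and relative paths, is:

    # sketch only: compress the bundled bib test file with the software module for 1 second
    ./build/examples/accel_perf -t 1 -w compress -l ./test/accel/bib

Both flags are taken from the traced command; bib appears to be a text file shipped with the SPDK accel tests purely as a conveniently sized compression input.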
00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:06:52.768 18:43:37 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:06:52.768 [2024-07-24 18:43:37.582312] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:52.768 [2024-07-24 18:43:37.582359] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2020580 ] 00:06:52.769 [2024-07-24 18:43:37.647678] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.769 [2024-07-24 18:43:37.717935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 
18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:06:53.027 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.028 18:43:37 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.028 18:43:37 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:53.961 18:43:38 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.961 00:06:53.961 real 0m1.367s 00:06:53.961 user 0m1.239s 00:06:53.961 sys 0m0.126s 00:06:53.961 18:43:38 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.961 18:43:38 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:06:53.961 ************************************ 00:06:53.961 END TEST accel_comp 00:06:53.961 ************************************ 00:06:53.961 18:43:38 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.961 18:43:38 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:53.961 18:43:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.961 18:43:38 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.219 ************************************ 00:06:54.219 START TEST accel_decomp 00:06:54.219 ************************************ 00:06:54.219 18:43:38 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test 
-t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:54.219 18:43:38 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:54.219 [2024-07-24 18:43:39.016777] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:54.219 [2024-07-24 18:43:39.016824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2020825 ] 00:06:54.220 [2024-07-24 18:43:39.080862] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.220 [2024-07-24 18:43:39.150711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- 
accel/accel.sh@20 -- # val=Yes 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.220 18:43:39 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.622 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:55.623 18:43:40 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.623 00:06:55.623 real 0m1.360s 00:06:55.623 user 0m1.240s 00:06:55.623 sys 0m0.126s 00:06:55.623 18:43:40 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:55.623 18:43:40 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:55.623 ************************************ 00:06:55.623 END TEST accel_decomp 00:06:55.623 ************************************ 00:06:55.623 18:43:40 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.623 18:43:40 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:06:55.623 18:43:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.623 18:43:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:55.623 ************************************ 00:06:55.623 START TEST accel_decomp_full 00:06:55.623 ************************************ 00:06:55.623 18:43:40 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:06:55.623 18:43:40 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:06:55.623 [2024-07-24 18:43:40.441985] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
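accel_decomp_full repeats the decompress run but with two extra flags in the traced command, -y and -o 0:

    # sketch only: full-buffer decompress with result verification
    ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y -o 0

Judging from the configuration dump that follows (val='111250 bytes' where the plain decompress run above used 4096-byte blocks), -o 0 appears to make accel_perf operate on the full decompressed size of the bib input instead of fixed 4 KiB chunks, and -y looks like a request to verify the output; both readings are inferred from the trace rather than from the tool's option help.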
00:06:55.623 [2024-07-24 18:43:40.442033] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2021073 ] 00:06:55.623 [2024-07-24 18:43:40.505171] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.623 [2024-07-24 18:43:40.575438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.892 18:43:40 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:56.829 18:43:41 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.829 00:06:56.829 real 0m1.362s 00:06:56.829 user 0m1.250s 00:06:56.829 sys 0m0.117s 00:06:56.829 18:43:41 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.829 18:43:41 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:06:56.829 ************************************ 00:06:56.829 END TEST accel_decomp_full 00:06:56.829 ************************************ 00:06:56.829 18:43:41 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:56.829 18:43:41 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:06:56.829 18:43:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.829 18:43:41 accel -- common/autotest_common.sh@10 -- # set +x 00:06:56.829 ************************************ 00:06:56.829 START TEST accel_decomp_mcore 00:06:56.829 ************************************ 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:57.088 18:43:41 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:57.088 [2024-07-24 18:43:41.867458] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:57.088 [2024-07-24 18:43:41.867517] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2021326 ] 00:06:57.088 [2024-07-24 18:43:41.932331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:57.088 [2024-07-24 18:43:42.005466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.088 [2024-07-24 18:43:42.005566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.088 [2024-07-24 18:43:42.005589] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.088 [2024-07-24 18:43:42.005590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.088 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:57.089 18:43:42 
accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.089 18:43:42 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.465 00:06:58.465 real 0m1.372s 00:06:58.465 user 0m4.611s 00:06:58.465 sys 0m0.130s 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.465 18:43:43 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:58.465 ************************************ 00:06:58.465 END TEST accel_decomp_mcore 00:06:58.465 ************************************ 00:06:58.465 18:43:43 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:58.465 18:43:43 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:06:58.465 18:43:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.465 18:43:43 accel -- common/autotest_common.sh@10 -- # set +x 00:06:58.465 ************************************ 00:06:58.465 START TEST accel_decomp_full_mcore 00:06:58.465 ************************************ 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.465 18:43:43 
accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:58.465 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:58.465 [2024-07-24 18:43:43.303357] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:58.465 [2024-07-24 18:43:43.303405] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2021571 ] 00:06:58.465 [2024-07-24 18:43:43.368392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:58.465 [2024-07-24 18:43:43.442857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:58.465 [2024-07-24 18:43:43.442957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:58.465 [2024-07-24 18:43:43.443039] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:58.465 [2024-07-24 18:43:43.443040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 
accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.724 18:43:43 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 
18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.661 00:06:59.661 real 0m1.386s 00:06:59.661 user 0m4.654s 00:06:59.661 sys 0m0.135s 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:59.661 18:43:44 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:59.661 ************************************ 00:06:59.661 END TEST accel_decomp_full_mcore 00:06:59.661 ************************************ 00:06:59.920 18:43:44 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.920 18:43:44 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:06:59.920 18:43:44 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.920 18:43:44 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.920 ************************************ 00:06:59.920 START TEST accel_decomp_mthread 00:06:59.920 ************************************ 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:59.920 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 
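Throughout these runs accel_perf is launched with -c /dev/fd/62; that descriptor is presumably a bash process substitution through which the test helper hands over whatever JSON build_accel_config assembled (empty for these pure-software runs). A minimal sketch of the same pattern, with accel_json_config as a hypothetical stand-in for that JSON and paths shortened to the spdk tree, is:

    # accel_json_config: hypothetical variable holding the JSON that
    # build_accel_config assembled (it stays empty for the software-only runs)
    ./build/examples/accel_perf -c <(printf '%s' "$accel_json_config") \
        -t 1 -w decompress -l test/accel/bib -y -T 2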
00:06:59.920 [2024-07-24 18:43:44.756592] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:06:59.920 [2024-07-24 18:43:44.756635] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2021827 ] 00:06:59.920 [2024-07-24 18:43:44.818897] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.920 [2024-07-24 18:43:44.890819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- 
accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.181 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.182 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.182 18:43:44 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 
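The long runs of case "$var" / IFS=: / read -r var val lines in the xtrace come from the test helper walking accel_perf's key:value settings dump and capturing fields such as accel_opc and accel_module. A simplified sketch of that parsing idiom, not the actual accel.sh code and with print_settings as a hypothetical producer of "key: value" lines, is:

    while IFS=: read -r var val; do
        case "$var" in
            opc)    accel_opc=${val# } ;;     # e.g. decompress
            module) accel_module=${val# } ;;  # e.g. software or dpdk_compressdev
        esac
    done < <(print_settings)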
00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.119 00:07:01.119 real 0m1.371s 00:07:01.119 user 0m1.251s 00:07:01.119 sys 0m0.119s 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:01.119 18:43:46 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:01.119 ************************************ 00:07:01.119 END TEST accel_decomp_mthread 00:07:01.119 ************************************ 00:07:01.379 18:43:46 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.379 18:43:46 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:01.379 18:43:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.379 18:43:46 accel -- common/autotest_common.sh@10 -- # 
set +x 00:07:01.379 ************************************ 00:07:01.379 START TEST accel_decomp_full_mthread 00:07:01.379 ************************************ 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:01.379 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:01.379 [2024-07-24 18:43:46.193272] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:01.379 [2024-07-24 18:43:46.193320] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2022069 ] 00:07:01.379 [2024-07-24 18:43:46.258987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.379 [2024-07-24 18:43:46.331376] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 
18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.639 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:01.640 18:43:46 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.640 18:43:46 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.577 00:07:02.577 real 0m1.405s 00:07:02.577 user 0m1.276s 00:07:02.577 sys 0m0.123s 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.577 18:43:47 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:02.577 ************************************ 00:07:02.577 END TEST accel_decomp_full_mthread 00:07:02.577 ************************************ 
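For comparison, the four software-path decompress tests above drive the same accel_perf binary and differ only in the parallelism and buffer flags; the -o 0 runs show '111250 bytes' buffers in the trace instead of '4096 bytes'. Command lines as logged, with the workspace prefix shortened:

    accel_perf -t 1 -w decompress -l test/accel/bib -y -m 0xf        # accel_decomp_mcore
    accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -m 0xf   # accel_decomp_full_mcore
    accel_perf -t 1 -w decompress -l test/accel/bib -y -T 2          # accel_decomp_mthread
    accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2     # accel_decomp_full_mthread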
00:07:02.837 18:43:47 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:02.837 18:43:47 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:02.837 18:43:47 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:02.837 18:43:47 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:02.837 18:43:47 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2022316 00:07:02.837 18:43:47 accel -- accel/accel.sh@63 -- # waitforlisten 2022316 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@829 -- # '[' -z 2022316 ']' 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.837 18:43:47 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:02.837 18:43:47 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.837 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.837 18:43:47 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:02.837 18:43:47 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.837 18:43:47 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.837 18:43:47 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.837 18:43:47 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.837 18:43:47 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:02.837 18:43:47 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:02.837 18:43:47 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:02.837 18:43:47 accel -- accel/accel.sh@41 -- # jq -r . 00:07:02.837 [2024-07-24 18:43:47.657065] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
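The compressdev stage that starts here boots spdk_tgt with a generated accel config enabling the DPDK compressdev module and then checks the opcode assignments over RPC. A hedged sketch of that flow, assuming the standard subsystems/config JSON framing and the stock rpc.py wrapper rather than the helper's exact plumbing:

    accel_json_cfg='{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}'
    ./build/bin/spdk_tgt -c <(printf '{"subsystems":[{"subsystem":"accel","config":[%s]}]}' "$accel_json_cfg") &
    # wait for /var/tmp/spdk.sock to appear, then confirm the module and the opcode map
    ./scripts/rpc.py save_config | jq -r '.subsystems[] | select(.subsystem=="accel").config[]' | grep compressdev_scan_accel_module
    ./scripts/rpc.py accel_get_opc_assignments   # compress/decompress are expected to map to dpdk_compressdev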
00:07:02.837 [2024-07-24 18:43:47.657111] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2022316 ] 00:07:02.837 [2024-07-24 18:43:47.721397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.837 [2024-07-24 18:43:47.796690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.404 [2024-07-24 18:43:48.171669] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@862 -- # return 0 00:07:03.664 18:43:48 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:03.664 18:43:48 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:03.664 18:43:48 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:03.664 18:43:48 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:03.664 18:43:48 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:03.664 18:43:48 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:03.664 18:43:48 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:03.664 18:43:48 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:03.664 "method": "compressdev_scan_accel_module", 00:07:03.664 18:43:48 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:03.664 18:43:48 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:03.664 18:43:48 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 
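The IFS== / read -r opc module lines above fill an expected_opcs map from the opcode=module pairs that accel_get_opc_assignments returns; in this run two of the opcodes (presumably compress and decompress) land on dpdk_compressdev and the rest on software. The underlying bash idiom, sketched independently of accel.sh, is:

    declare -A expected_opcs
    while IFS== read -r opc module; do
        expected_opcs["$opc"]=$module   # e.g. expected_opcs[decompress]=dpdk_compressdev
    done < <(./scripts/rpc.py accel_get_opc_assignments | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]')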
00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.664 18:43:48 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.664 18:43:48 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.664 18:43:48 accel -- accel/accel.sh@75 -- # killprocess 2022316 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@948 -- # '[' -z 2022316 ']' 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@952 -- # kill -0 2022316 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@953 -- # uname 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:03.664 18:43:48 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2022316 00:07:03.924 18:43:48 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:03.924 18:43:48 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:03.924 18:43:48 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2022316' 00:07:03.924 killing process with pid 2022316 00:07:03.924 18:43:48 accel -- common/autotest_common.sh@967 -- # kill 2022316 00:07:03.924 18:43:48 accel -- common/autotest_common.sh@972 -- # wait 2022316 00:07:04.183 18:43:49 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:04.183 18:43:49 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.183 18:43:49 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:04.183 18:43:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.183 18:43:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.183 ************************************ 00:07:04.183 START TEST accel_cdev_comp 00:07:04.183 ************************************ 00:07:04.183 18:43:49 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.183 18:43:49 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # read -r var val 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:04.183 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:04.183 [2024-07-24 18:43:49.082681] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:07:04.183 [2024-07-24 18:43:49.082739] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2022642 ] 00:07:04.183 [2024-07-24 18:43:49.148534] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.442 [2024-07-24 18:43:49.223099] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.701 [2024-07-24 18:43:49.580758] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:04.701 [2024-07-24 18:43:49.582388] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xc03a80 PMD being used: compress_qat 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 [2024-07-24 18:43:49.585684] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe08870 PMD being used: compress_qat 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case 
"$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:04.701 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:04.702 18:43:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:06.080 18:43:50 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:06.080 00:07:06.080 real 0m1.677s 00:07:06.080 user 0m1.391s 00:07:06.080 sys 0m0.288s 00:07:06.080 18:43:50 accel.accel_cdev_comp -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.080 18:43:50 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:06.080 ************************************ 00:07:06.080 END TEST accel_cdev_comp 00:07:06.080 ************************************ 00:07:06.080 18:43:50 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:06.080 18:43:50 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:06.080 18:43:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.080 18:43:50 accel -- common/autotest_common.sh@10 -- # set +x 00:07:06.080 ************************************ 00:07:06.080 START TEST accel_cdev_decomp 00:07:06.080 ************************************ 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:06.080 18:43:50 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:06.080 [2024-07-24 18:43:50.820403] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
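For reference, the accel_cdev_comp case above reduces to a single accel_perf invocation fed a one-entry JSON config. A minimal sketch of running it by hand, where the ./spdk working directory and the config wrapper layout are assumptions; only the compressdev_scan_accel_module entry and the accel_perf flags are taken from the run above:
# assumed wrapper layout; only the method entry and the flags below appear in this log
cfg='{"subsystems":[{"subsystem":"accel","config":[{"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}'
./spdk/build/examples/accel_perf -c <(echo "$cfg") -t 1 -w compress -l ./spdk/test/accel/bib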
00:07:06.080 [2024-07-24 18:43:50.820443] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2023025 ] 00:07:06.080 [2024-07-24 18:43:50.883844] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.080 [2024-07-24 18:43:50.954990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.340 [2024-07-24 18:43:51.316420] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:06.340 [2024-07-24 18:43:51.318100] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fb2a80 PMD being used: compress_qat 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 [2024-07-24 18:43:51.321438] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21b7870 PMD being used: compress_qat 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:07:06.340 18:43:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.720 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.720 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:07.721 00:07:07.721 real 0m1.670s 00:07:07.721 user 0m1.409s 00:07:07.721 sys 0m0.266s 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:07.721 18:43:52 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:07.721 ************************************ 00:07:07.721 END TEST accel_cdev_decomp 00:07:07.721 ************************************ 00:07:07.721 18:43:52 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.721 18:43:52 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:07.721 18:43:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.721 18:43:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.721 ************************************ 00:07:07.721 START TEST accel_cdev_decomp_full 00:07:07.721 ************************************ 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:07.721 18:43:52 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:07.721 [2024-07-24 18:43:52.549501] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:07.721 [2024-07-24 18:43:52.549547] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2023272 ] 00:07:07.721 [2024-07-24 18:43:52.613513] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.721 [2024-07-24 18:43:52.684874] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.289 [2024-07-24 18:43:53.045074] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:08.289 [2024-07-24 18:43:53.046718] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x120ea80 PMD being used: compress_qat 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 [2024-07-24 18:43:53.049168] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1211dd0 PMD being used: compress_qat 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
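Judging from the values echoed above, the _full variant differs from plain decompress only in transfer size: with -o 0 the run works on the whole 111250-byte bib file instead of the default 4096-byte blocks. A minimal sketch, reusing the assumed $cfg from the compress sketch earlier:
# flags as passed by run_test above; -o 0 is what makes the run use the full file
./spdk/build/examples/accel_perf -c <(echo "$cfg") -t 1 -w decompress -l ./spdk/test/accel/bib -y -o 0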
00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.289 18:43:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:09.223 00:07:09.223 real 0m1.675s 00:07:09.223 user 0m1.409s 00:07:09.223 sys 0m0.267s 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.223 18:43:54 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:09.223 ************************************ 00:07:09.223 END TEST accel_cdev_decomp_full 00:07:09.223 ************************************ 00:07:09.223 18:43:54 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.223 18:43:54 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:09.223 18:43:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.223 18:43:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:09.482 ************************************ 00:07:09.482 START TEST accel_cdev_decomp_mcore 00:07:09.482 ************************************ 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:09.482 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:09.482 [2024-07-24 18:43:54.276413] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
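The mcore case repeats the same decompress workload across four reactors: the -m 0xf mask passed to accel_perf above shows up as '-c 0xf' in the EAL parameters below, four 'Reactor started' notices follow, and per-core compress_qat channels are logged. A minimal sketch, reusing the assumed $cfg from the compress sketch earlier:
# -m 0xf: same decompress workload pinned to cores 0-3
./spdk/build/examples/accel_perf -c <(echo "$cfg") -t 1 -w decompress -l ./spdk/test/accel/bib -y -m 0xf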
00:07:09.482 [2024-07-24 18:43:54.276456] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2023526 ] 00:07:09.483 [2024-07-24 18:43:54.339341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.483 [2024-07-24 18:43:54.419765] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.483 [2024-07-24 18:43:54.419867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:09.483 [2024-07-24 18:43:54.419900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.483 [2024-07-24 18:43:54.419902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.051 [2024-07-24 18:43:54.808600] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:10.051 [2024-07-24 18:43:54.810263] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20a40c0 PMD being used: compress_qat 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 [2024-07-24 18:43:54.814668] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f313819b8b0 PMD being used: compress_qat 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 [2024-07-24 18:43:54.815464] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f313019b8b0 PMD being used: compress_qat 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:10.051 [2024-07-24 18:43:54.816139] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20a9440 PMD being used: compress_qat 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 [2024-07-24 18:43:54.816311] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f312819b8b0 PMD being used: compress_qat 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.051 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.052 18:43:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:10.989 00:07:10.989 real 0m1.729s 00:07:10.989 user 0m5.845s 00:07:10.989 sys 0m0.292s 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.989 18:43:55 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:10.989 ************************************ 00:07:10.989 END TEST accel_cdev_decomp_mcore 00:07:10.989 ************************************ 00:07:11.248 18:43:56 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.248 18:43:56 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:11.248 18:43:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.248 18:43:56 accel -- common/autotest_common.sh@10 -- # set +x 00:07:11.248 ************************************ 00:07:11.248 START TEST accel_cdev_decomp_full_mcore 00:07:11.248 ************************************ 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:11.248 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # 
accel_json_cfg=() 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:11.249 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:11.249 [2024-07-24 18:43:56.070330] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:07:11.249 [2024-07-24 18:43:56.070380] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2023961 ] 00:07:11.249 [2024-07-24 18:43:56.136285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:11.249 [2024-07-24 18:43:56.209942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.249 [2024-07-24 18:43:56.210041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:11.249 [2024-07-24 18:43:56.210133] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:11.249 [2024-07-24 18:43:56.210134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.817 [2024-07-24 18:43:56.593889] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:11.817 [2024-07-24 18:43:56.595562] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22b30c0 PMD being used: compress_qat 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 [2024-07-24 18:43:56.599180] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb3ec19b8b0 PMD being used: compress_qat 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 [2024-07-24 18:43:56.600030] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb3e419b8b0 PMD being used: compress_qat 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:11.817 [2024-07-24 18:43:56.600673] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x22b3160 PMD being used: compress_qat 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 [2024-07-24 18:43:56.600833] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb3dc19b8b0 PMD being used: compress_qat 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.817 18:43:56 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.817 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.818 18:43:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.755 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:12.756 00:07:12.756 real 0m1.718s 00:07:12.756 user 0m5.808s 00:07:12.756 sys 0m0.312s 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.756 18:43:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:12.756 ************************************ 00:07:12.756 END TEST accel_cdev_decomp_full_mcore 00:07:12.756 ************************************ 00:07:13.015 18:43:57 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.015 18:43:57 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:13.015 18:43:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.015 18:43:57 accel -- 
common/autotest_common.sh@10 -- # set +x 00:07:13.015 ************************************ 00:07:13.015 START TEST accel_cdev_decomp_mthread 00:07:13.015 ************************************ 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:13.015 18:43:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:13.015 [2024-07-24 18:43:57.837487] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:13.015 [2024-07-24 18:43:57.837533] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2024239 ] 00:07:13.015 [2024-07-24 18:43:57.902752] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.015 [2024-07-24 18:43:57.974829] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.584 [2024-07-24 18:43:58.335311] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:13.584 [2024-07-24 18:43:58.336979] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20b4a80 PMD being used: compress_qat 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 [2024-07-24 18:43:58.340915] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20b9c40 PMD being used: compress_qat 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:13.584 [2024-07-24 18:43:58.342461] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21dca30 PMD being used: compress_qat 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:13.584 18:43:58 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 
18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.584 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.585 18:43:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:14.618 00:07:14.618 real 0m1.678s 00:07:14.618 user 0m1.400s 00:07:14.618 sys 0m0.278s 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.618 18:43:59 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:14.618 ************************************ 00:07:14.618 END TEST accel_cdev_decomp_mthread 00:07:14.618 ************************************ 00:07:14.618 18:43:59 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.618 18:43:59 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:14.618 18:43:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.618 18:43:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.618 ************************************ 00:07:14.618 START TEST accel_cdev_decomp_full_mthread 00:07:14.618 ************************************ 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:14.619 18:43:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:14.619 [2024-07-24 18:43:59.568634] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:14.619 [2024-07-24 18:43:59.568681] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2024496 ] 00:07:14.877 [2024-07-24 18:43:59.632888] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.877 [2024-07-24 18:43:59.703507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.137 [2024-07-24 18:44:00.075574] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:15.137 [2024-07-24 18:44:00.077226] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1131a80 PMD being used: compress_qat 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.137 [2024-07-24 18:44:00.080379] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1134dd0 PMD being used: compress_qat 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:15.137 [2024-07-24 18:44:00.082047] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13366c0 PMD being used: compress_qat 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:15.137 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.138 18:44:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:16.516 00:07:16.516 real 0m1.688s 00:07:16.516 user 0m1.405s 00:07:16.516 sys 0m0.282s 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.516 18:44:01 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:16.516 ************************************ 00:07:16.516 END TEST accel_cdev_decomp_full_mthread 00:07:16.516 ************************************ 00:07:16.516 18:44:01 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:07:16.516 18:44:01 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:16.516 18:44:01 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:16.516 18:44:01 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:16.516 18:44:01 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:16.516 18:44:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.516 18:44:01 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:16.516 18:44:01 accel -- common/autotest_common.sh@10 -- # set +x 00:07:16.516 18:44:01 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.516 18:44:01 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.516 18:44:01 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:16.516 18:44:01 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:16.516 18:44:01 accel -- accel/accel.sh@41 -- # jq -r . 00:07:16.516 ************************************ 00:07:16.516 START TEST accel_dif_functional_tests 00:07:16.516 ************************************ 00:07:16.516 18:44:01 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:16.516 [2024-07-24 18:44:01.343863] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:16.516 [2024-07-24 18:44:01.343898] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2024784 ]
00:07:16.516 [2024-07-24 18:44:01.406293] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:07:16.516 [2024-07-24 18:44:01.480236] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:16.516 [2024-07-24 18:44:01.480333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:07:16.516 [2024-07-24 18:44:01.480335] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:16.775
00:07:16.775
00:07:16.775 CUnit - A unit testing framework for C - Version 2.1-3
00:07:16.775 http://cunit.sourceforge.net/
00:07:16.775
00:07:16.775
00:07:16.775 Suite: accel_dif
00:07:16.775 Test: verify: DIF generated, GUARD check ...passed
00:07:16.775 Test: verify: DIF generated, APPTAG check ...passed
00:07:16.775 Test: verify: DIF generated, REFTAG check ...passed
00:07:16.775 Test: verify: DIF not generated, GUARD check ...[2024-07-24 18:44:01.565455] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:07:16.775 passed
00:07:16.775 Test: verify: DIF not generated, APPTAG check ...[2024-07-24 18:44:01.565510] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:07:16.775 passed
00:07:16.775 Test: verify: DIF not generated, REFTAG check ...[2024-07-24 18:44:01.565548] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:07:16.775 passed
00:07:16.775 Test: verify: APPTAG correct, APPTAG check ...passed
00:07:16.775 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-24 18:44:01.565590] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14
00:07:16.775 passed
00:07:16.775 Test: verify: APPTAG incorrect, no APPTAG check ...passed
00:07:16.775 Test: verify: REFTAG incorrect, REFTAG ignore ...passed
00:07:16.775 Test: verify: REFTAG_INIT correct, REFTAG check ...passed
00:07:16.775 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-24 18:44:01.565695] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10
00:07:16.775 passed
00:07:16.775 Test: verify copy: DIF generated, GUARD check ...passed
00:07:16.775 Test: verify copy: DIF generated, APPTAG check ...passed
00:07:16.775 Test: verify copy: DIF generated, REFTAG check ...passed
00:07:16.775 Test: verify copy: DIF not generated, GUARD check ...[2024-07-24 18:44:01.565806] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:07:16.775 passed
00:07:16.775 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-24 18:44:01.565825] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:07:16.775 passed
00:07:16.775 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-24 18:44:01.565844] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:07:16.775 passed
00:07:16.775 Test: generate copy: DIF generated, GUARD check ...passed
00:07:16.775 Test: generate copy: DIF generated, APTTAG check ...passed
00:07:16.775 Test: generate copy: DIF generated, REFTAG check ...passed
00:07:16.775 Test: generate copy: DIF generated, no GUARD check flag set ...passed
00:07:16.775 Test: generate copy: DIF generated, no APPTAG check flag set ...passed
00:07:16.775 Test: generate copy: DIF generated, no REFTAG check flag set ...passed
00:07:16.775 Test: generate copy: iovecs-len validate ...[2024-07-24 18:44:01.566006] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size.
00:07:16.775 passed
00:07:16.775 Test: generate copy: buffer alignment validate ...passed
00:07:16.775
00:07:16.775 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:07:16.775               suites      1      1    n/a      0        0
00:07:16.775                tests     26     26     26      0        0
00:07:16.775              asserts    115    115    115      0      n/a
00:07:16.775
00:07:16.775 Elapsed time = 0.002 seconds
00:07:16.775
00:07:16.775 real 0m0.442s
00:07:16.775 user 0m0.663s
00:07:16.775 sys 0m0.150s
00:07:16.775 18:44:01 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:16.775 18:44:01 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x
00:07:16.775 ************************************
00:07:16.775 END TEST accel_dif_functional_tests
00:07:16.775 ************************************
00:07:16.775
00:07:16.775 real 0m45.162s
00:07:16.775 user 0m55.254s
00:07:16.775 sys 0m7.163s
00:07:16.775 18:44:01 accel -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:16.775 18:44:01 accel -- common/autotest_common.sh@10 -- # set +x
00:07:16.775 ************************************
00:07:16.775 END TEST accel
00:07:16.775 ************************************
00:07:17.034 18:44:01 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh
00:07:17.034 18:44:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:17.034 18:44:01 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:17.034 18:44:01 -- common/autotest_common.sh@10 -- # set +x
00:07:17.034 ************************************
00:07:17.034 START TEST accel_rpc
00:07:17.034 ************************************
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh
00:07:17.035 * Looking for test storage...
00:07:17.035 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel
00:07:17.035 18:44:01 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:07:17.035 18:44:01 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2025022
00:07:17.035 18:44:01 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc
00:07:17.035 18:44:01 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2025022
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2025022 ']'
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
18:44:01 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:17.035 18:44:01 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:17.035 [2024-07-24 18:44:01.959896] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization...
00:07:17.035 [2024-07-24 18:44:01.959950] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2025022 ] 00:07:17.035 [2024-07-24 18:44:02.024389] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.293 [2024-07-24 18:44:02.102225] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.861 18:44:02 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:17.861 18:44:02 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:17.861 18:44:02 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:17.861 18:44:02 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:17.861 18:44:02 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:17.861 18:44:02 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:17.861 18:44:02 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:17.861 18:44:02 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:17.861 18:44:02 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.861 18:44:02 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.861 ************************************ 00:07:17.861 START TEST accel_assign_opcode 00:07:17.861 ************************************ 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.861 [2024-07-24 18:44:02.780210] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.861 [2024-07-24 18:44:02.788223] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.861 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- 
accel/accel_rpc.sh@42 -- # grep software 00:07:18.121 18:44:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:18.121 18:44:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.121 software 00:07:18.121 00:07:18.121 real 0m0.260s 00:07:18.121 user 0m0.045s 00:07:18.121 sys 0m0.012s 00:07:18.121 18:44:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.121 18:44:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:18.121 ************************************ 00:07:18.121 END TEST accel_assign_opcode 00:07:18.121 ************************************ 00:07:18.121 18:44:03 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2025022 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2025022 ']' 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2025022 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2025022 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2025022' 00:07:18.121 killing process with pid 2025022 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@967 -- # kill 2025022 00:07:18.121 18:44:03 accel_rpc -- common/autotest_common.sh@972 -- # wait 2025022 00:07:18.689 00:07:18.689 real 0m1.588s 00:07:18.689 user 0m1.640s 00:07:18.689 sys 0m0.412s 00:07:18.689 18:44:03 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.689 18:44:03 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:18.689 ************************************ 00:07:18.689 END TEST accel_rpc 00:07:18.689 ************************************ 00:07:18.689 18:44:03 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:18.689 18:44:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:18.689 18:44:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.689 18:44:03 -- common/autotest_common.sh@10 -- # set +x 00:07:18.689 ************************************ 00:07:18.689 START TEST app_cmdline 00:07:18.689 ************************************ 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:18.689 * Looking for test storage... 
00:07:18.689 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:18.689 18:44:03 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:18.689 18:44:03 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2025331 00:07:18.689 18:44:03 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:18.689 18:44:03 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2025331 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2025331 ']' 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:18.689 18:44:03 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:18.689 [2024-07-24 18:44:03.600292] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:07:18.689 [2024-07-24 18:44:03.600336] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2025331 ] 00:07:18.689 [2024-07-24 18:44:03.664424] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.948 [2024-07-24 18:44:03.743244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.515 18:44:04 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:19.515 18:44:04 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:07:19.516 18:44:04 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:19.774 { 00:07:19.774 "version": "SPDK v24.09-pre git sha1 0bb5c21e2", 00:07:19.774 "fields": { 00:07:19.774 "major": 24, 00:07:19.774 "minor": 9, 00:07:19.774 "patch": 0, 00:07:19.774 "suffix": "-pre", 00:07:19.774 "commit": "0bb5c21e2" 00:07:19.774 } 00:07:19.774 } 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]]
00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@648 -- # local es=0
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:19.774 request:
00:07:19.774 {
00:07:19.774   "method": "env_dpdk_get_mem_stats",
00:07:19.774   "req_id": 1
00:07:19.774 }
00:07:19.774 Got JSON-RPC error response
00:07:19.774 response:
00:07:19.774 {
00:07:19.774   "code": -32601,
00:07:19.774   "message": "Method not found"
00:07:19.774 }
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@651 -- # es=1
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:07:19.774 18:44:04 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2025331
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2025331 ']'
00:07:19.774 18:44:04 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2025331
00:07:19.775 18:44:04 app_cmdline -- common/autotest_common.sh@953 -- # uname
00:07:19.775 18:44:04 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:19.775 18:44:04 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2025331
00:07:20.033 18:44:04 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:20.033 18:44:04 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:20.033 18:44:04 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2025331'
00:07:20.033 killing process with pid 2025331
00:07:20.033 18:44:04 app_cmdline -- common/autotest_common.sh@967 -- # kill 2025331
00:07:20.033 18:44:04 app_cmdline -- common/autotest_common.sh@972 -- # wait 2025331
00:07:20.293
00:07:20.293 real 0m1.630s
00:07:20.293 user 0m1.911s
00:07:20.293 sys 0m0.421s
00:07:20.293 18:44:05 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:20.293 18:44:05 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:07:20.293 ************************************ 00:07:20.293 END TEST app_cmdline 00:07:20.293 ************************************ 00:07:20.293 18:44:05 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:20.293 18:44:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:20.293 18:44:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.293 18:44:05 -- common/autotest_common.sh@10 -- # set +x 00:07:20.293 ************************************ 00:07:20.293 START TEST version 00:07:20.293 ************************************ 00:07:20.293 18:44:05 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:20.293 * Looking for test storage... 00:07:20.293 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:20.293 18:44:05 version -- app/version.sh@17 -- # get_header_version major 00:07:20.293 18:44:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # cut -f2 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.293 18:44:05 version -- app/version.sh@17 -- # major=24 00:07:20.293 18:44:05 version -- app/version.sh@18 -- # get_header_version minor 00:07:20.293 18:44:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # cut -f2 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.293 18:44:05 version -- app/version.sh@18 -- # minor=9 00:07:20.293 18:44:05 version -- app/version.sh@19 -- # get_header_version patch 00:07:20.293 18:44:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # cut -f2 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.293 18:44:05 version -- app/version.sh@19 -- # patch=0 00:07:20.293 18:44:05 version -- app/version.sh@20 -- # get_header_version suffix 00:07:20.293 18:44:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # cut -f2 00:07:20.293 18:44:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.293 18:44:05 version -- app/version.sh@20 -- # suffix=-pre 00:07:20.293 18:44:05 version -- app/version.sh@22 -- # version=24.9 00:07:20.293 18:44:05 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:20.293 18:44:05 version -- app/version.sh@28 -- # version=24.9rc0 00:07:20.293 18:44:05 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:20.293 18:44:05 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:20.552 18:44:05 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:20.552 18:44:05 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 
]] 00:07:20.552 00:07:20.552 real 0m0.149s 00:07:20.552 user 0m0.072s 00:07:20.552 sys 0m0.114s 00:07:20.552 18:44:05 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.552 18:44:05 version -- common/autotest_common.sh@10 -- # set +x 00:07:20.552 ************************************ 00:07:20.552 END TEST version 00:07:20.552 ************************************ 00:07:20.552 18:44:05 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:07:20.552 18:44:05 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:20.552 18:44:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:20.552 18:44:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.552 18:44:05 -- common/autotest_common.sh@10 -- # set +x 00:07:20.552 ************************************ 00:07:20.552 START TEST blockdev_general 00:07:20.552 ************************************ 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:20.552 * Looking for test storage... 00:07:20.552 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:20.552 18:44:05 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:07:20.552 18:44:05 
blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2025782 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:20.552 18:44:05 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2025782 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2025782 ']' 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:20.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:20.552 18:44:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:20.552 [2024-07-24 18:44:05.541916] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:07:20.552 [2024-07-24 18:44:05.541962] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2025782 ] 00:07:20.812 [2024-07-24 18:44:05.606637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.812 [2024-07-24 18:44:05.683003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.380 18:44:06 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:21.380 18:44:06 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:07:21.380 18:44:06 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:07:21.380 18:44:06 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:07:21.380 18:44:06 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:21.380 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.380 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.639 [2024-07-24 18:44:06.518657] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:21.639 [2024-07-24 18:44:06.518701] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:21.639 00:07:21.639 [2024-07-24 18:44:06.526653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:21.639 [2024-07-24 18:44:06.526668] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:21.639 00:07:21.639 Malloc0 00:07:21.639 Malloc1 00:07:21.639 Malloc2 00:07:21.639 Malloc3 00:07:21.639 Malloc4 00:07:21.639 Malloc5 00:07:21.639 Malloc6 00:07:21.639 Malloc7 00:07:21.639 Malloc8 00:07:21.639 Malloc9 00:07:21.899 [2024-07-24 18:44:06.651196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:21.899 [2024-07-24 18:44:06.651232] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:21.899 [2024-07-24 18:44:06.651243] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0e680 00:07:21.899 [2024-07-24 
18:44:06.651249] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:21.899 [2024-07-24 18:44:06.652180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:21.899 [2024-07-24 18:44:06.652199] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:21.899 TestPT 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:21.899 5000+0 records in 00:07:21.899 5000+0 records out 00:07:21.899 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0177054 s, 578 MB/s 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 AIO0 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.899 18:44:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:07:21.899 18:44:06 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:07:21.900 18:44:06 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' 
'{' ' "name": "Malloc0",' ' "aliases": [' ' "65959268-e9fe-4e98-8c1c-1a6aae5df474"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "65959268-e9fe-4e98-8c1c-1a6aae5df474",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "a1b62a94-cd9e-52a1-ad91-152da23f9c1c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a1b62a94-cd9e-52a1-ad91-152da23f9c1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "019bba84-cc80-5697-9225-636efacd0f43"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "019bba84-cc80-5697-9225-636efacd0f43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "45b55701-cd11-546b-a237-8f3076f23cc5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "45b55701-cd11-546b-a237-8f3076f23cc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "df84e28a-4455-54b2-b33a-e389383719b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df84e28a-4455-54b2-b33a-e389383719b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "f0a2cf31-7c77-56e8-b197-d236c1aac61c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f0a2cf31-7c77-56e8-b197-d236c1aac61c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "0bfc745a-20b9-5b20-81ff-2184a98818a3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0bfc745a-20b9-5b20-81ff-2184a98818a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f64382ad-312e-5db6-8c1b-e56caf7476a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f64382ad-312e-5db6-8c1b-e56caf7476a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "2c5fb105-88a0-4285-9c75-931f827eb641"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2fc39124-b0a6-4e81-98ee-2956c31a77e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0f2e17c0-c340-48a1-9a90-131b8b95b882",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bc1c615f-ef65-41e1-8290-0896f4df63d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fa43c441-8514-47a1-b156-b10e7c30b132",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "1c20925f-01a9-4241-b17b-ccd7493d0c83",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "24014bff-3a03-4bfc-a610-465675b93ba6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "4e481d73-5536-4380-81dc-96b7d5513589",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "bf76b770-0e0b-4c4d-9098-c0cf7cdab04b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5f1a9707-a285-4b5b-94fb-dbbd071e59fd"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5f1a9707-a285-4b5b-94fb-dbbd071e59fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 
0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:22.159 18:44:06 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:07:22.159 18:44:06 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:07:22.159 18:44:06 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:07:22.159 18:44:06 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 2025782 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2025782 ']' 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2025782 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2025782 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2025782' 00:07:22.159 killing process with pid 2025782 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@967 -- # kill 2025782 00:07:22.159 18:44:06 blockdev_general -- common/autotest_common.sh@972 -- # wait 2025782 00:07:22.419 18:44:07 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:22.419 18:44:07 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:22.419 18:44:07 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:22.419 18:44:07 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.419 18:44:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:22.419 ************************************ 00:07:22.419 START TEST bdev_hello_world 00:07:22.419 ************************************ 00:07:22.419 18:44:07 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:22.678 [2024-07-24 18:44:07.444424] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:22.678 [2024-07-24 18:44:07.444459] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2026156 ] 00:07:22.678 [2024-07-24 18:44:07.506990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.678 [2024-07-24 18:44:07.579690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.937 [2024-07-24 18:44:07.717567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:22.937 [2024-07-24 18:44:07.717606] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:22.937 [2024-07-24 18:44:07.717614] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:22.937 [2024-07-24 18:44:07.725574] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:22.937 [2024-07-24 18:44:07.725594] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:22.937 [2024-07-24 18:44:07.733587] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:22.937 [2024-07-24 18:44:07.733601] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:22.937 [2024-07-24 18:44:07.800760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:22.937 [2024-07-24 18:44:07.800797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:22.937 [2024-07-24 18:44:07.800806] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x144de50 00:07:22.937 [2024-07-24 18:44:07.800816] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:22.937 [2024-07-24 18:44:07.801745] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:22.937 [2024-07-24 18:44:07.801765] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:22.937 [2024-07-24 18:44:07.932859] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:22.937 [2024-07-24 18:44:07.932898] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:22.937 [2024-07-24 18:44:07.932921] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:22.937 [2024-07-24 18:44:07.932955] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:22.937 [2024-07-24 18:44:07.932992] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:22.937 [2024-07-24 18:44:07.933003] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:22.937 [2024-07-24 18:44:07.933030] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
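The hello_bdev run above can be reproduced standalone with the same arguments the harness used: the example opens the bdev named by -b from the JSON config, writes "Hello World!", reads it back, and stops. The paths below mirror this workspace and are assumptions for a local checkout.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Write and read back "Hello World!" on Malloc0, as in the log output above.
"$SPDK/build/examples/hello_bdev" --json "$SPDK/test/bdev/bdev.json" -b Malloc0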
00:07:22.937 00:07:22.937 [2024-07-24 18:44:07.933046] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:23.196 00:07:23.196 real 0m0.791s 00:07:23.196 user 0m0.546s 00:07:23.196 sys 0m0.211s 00:07:23.196 18:44:08 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.196 18:44:08 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:23.196 ************************************ 00:07:23.196 END TEST bdev_hello_world 00:07:23.196 ************************************ 00:07:23.455 18:44:08 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:07:23.455 18:44:08 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:23.455 18:44:08 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.455 18:44:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:23.455 ************************************ 00:07:23.455 START TEST bdev_bounds 00:07:23.455 ************************************ 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=2026298 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 2026298' 00:07:23.455 Process bdevio pid: 2026298 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 2026298 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2026298 ']' 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.455 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:23.455 18:44:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:23.455 [2024-07-24 18:44:08.290905] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:07:23.455 [2024-07-24 18:44:08.290942] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2026298 ] 00:07:23.455 [2024-07-24 18:44:08.355358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:23.455 [2024-07-24 18:44:08.436767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.455 [2024-07-24 18:44:08.436866] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.455 [2024-07-24 18:44:08.436866] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:23.714 [2024-07-24 18:44:08.576368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:23.715 [2024-07-24 18:44:08.576412] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:23.715 [2024-07-24 18:44:08.576420] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:23.715 [2024-07-24 18:44:08.584383] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:23.715 [2024-07-24 18:44:08.584403] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:23.715 [2024-07-24 18:44:08.592397] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:23.715 [2024-07-24 18:44:08.592410] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:23.715 [2024-07-24 18:44:08.659996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:23.715 [2024-07-24 18:44:08.660037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:23.715 [2024-07-24 18:44:08.660046] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2887af0 00:07:23.715 [2024-07-24 18:44:08.660052] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:23.715 [2024-07-24 18:44:08.661075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:23.715 [2024-07-24 18:44:08.661097] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:24.283 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:24.283 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:07:24.283 18:44:09 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:24.283 I/O targets: 00:07:24.283 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:24.283 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:24.283 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:24.283 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:24.283 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:24.283 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:24.283 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:07:24.283 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:24.283 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:24.283 00:07:24.283 00:07:24.283 CUnit - A unit testing framework for C - Version 2.1-3 00:07:24.283 http://cunit.sourceforge.net/ 00:07:24.283 00:07:24.283 00:07:24.283 Suite: bdevio tests on: AIO0 00:07:24.283 Test: blockdev write read block ...passed 00:07:24.283 Test: blockdev write zeroes read block ...passed 00:07:24.283 Test: blockdev write zeroes read no split ...passed 00:07:24.283 Test: blockdev write zeroes read split ...passed 00:07:24.283 Test: blockdev write zeroes read split partial ...passed 00:07:24.283 Test: blockdev reset ...passed 00:07:24.283 Test: blockdev write read 8 blocks ...passed 00:07:24.283 Test: blockdev write read size > 128k ...passed 00:07:24.283 Test: blockdev write read invalid size ...passed 00:07:24.283 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.283 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.283 Test: blockdev write read max offset ...passed 00:07:24.283 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.283 Test: blockdev writev readv 8 blocks ...passed 00:07:24.283 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.283 Test: blockdev writev readv block ...passed 00:07:24.283 Test: blockdev writev readv size > 128k ...passed 00:07:24.284 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.284 Test: blockdev comparev and writev ...passed 00:07:24.284 Test: blockdev nvme passthru rw ...passed 00:07:24.284 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.284 Test: blockdev nvme admin passthru ...passed 00:07:24.284 Test: blockdev copy ...passed 00:07:24.284 Suite: bdevio tests on: raid1 00:07:24.284 Test: blockdev write read block ...passed 00:07:24.284 Test: blockdev write zeroes read block ...passed 00:07:24.284 Test: blockdev write zeroes read no split ...passed 00:07:24.284 Test: blockdev write zeroes read split ...passed 00:07:24.284 Test: blockdev write zeroes read split partial ...passed 00:07:24.284 Test: blockdev reset ...passed 00:07:24.284 Test: blockdev write read 8 blocks ...passed 00:07:24.284 Test: blockdev write read size > 128k ...passed 00:07:24.284 Test: blockdev write read invalid size ...passed 00:07:24.284 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.284 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.284 Test: blockdev write read max offset ...passed 00:07:24.284 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.284 Test: blockdev writev readv 8 blocks ...passed 00:07:24.284 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.284 Test: blockdev writev readv block ...passed 00:07:24.284 Test: blockdev writev readv size > 128k ...passed 00:07:24.284 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.284 Test: blockdev comparev and writev ...passed 00:07:24.284 Test: blockdev nvme passthru rw ...passed 00:07:24.284 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.284 Test: blockdev nvme admin passthru ...passed 00:07:24.284 Test: blockdev copy ...passed 00:07:24.284 Suite: bdevio tests on: concat0 00:07:24.284 Test: blockdev write read block ...passed 00:07:24.284 Test: blockdev write zeroes read block ...passed 00:07:24.284 Test: blockdev write zeroes read no split ...passed 00:07:24.284 Test: blockdev write zeroes read split 
...passed 00:07:24.284 Test: blockdev write zeroes read split partial ...passed 00:07:24.284 Test: blockdev reset ...passed 00:07:24.284 Test: blockdev write read 8 blocks ...passed 00:07:24.284 Test: blockdev write read size > 128k ...passed 00:07:24.284 Test: blockdev write read invalid size ...passed 00:07:24.284 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.284 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.284 Test: blockdev write read max offset ...passed 00:07:24.284 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.284 Test: blockdev writev readv 8 blocks ...passed 00:07:24.284 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.284 Test: blockdev writev readv block ...passed 00:07:24.284 Test: blockdev writev readv size > 128k ...passed 00:07:24.284 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.284 Test: blockdev comparev and writev ...passed 00:07:24.284 Test: blockdev nvme passthru rw ...passed 00:07:24.284 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.284 Test: blockdev nvme admin passthru ...passed 00:07:24.284 Test: blockdev copy ...passed 00:07:24.284 Suite: bdevio tests on: raid0 00:07:24.284 Test: blockdev write read block ...passed 00:07:24.284 Test: blockdev write zeroes read block ...passed 00:07:24.284 Test: blockdev write zeroes read no split ...passed 00:07:24.284 Test: blockdev write zeroes read split ...passed 00:07:24.284 Test: blockdev write zeroes read split partial ...passed 00:07:24.284 Test: blockdev reset ...passed 00:07:24.284 Test: blockdev write read 8 blocks ...passed 00:07:24.284 Test: blockdev write read size > 128k ...passed 00:07:24.284 Test: blockdev write read invalid size ...passed 00:07:24.284 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.284 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.284 Test: blockdev write read max offset ...passed 00:07:24.284 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.284 Test: blockdev writev readv 8 blocks ...passed 00:07:24.284 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.284 Test: blockdev writev readv block ...passed 00:07:24.284 Test: blockdev writev readv size > 128k ...passed 00:07:24.284 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.284 Test: blockdev comparev and writev ...passed 00:07:24.284 Test: blockdev nvme passthru rw ...passed 00:07:24.284 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.284 Test: blockdev nvme admin passthru ...passed 00:07:24.284 Test: blockdev copy ...passed 00:07:24.284 Suite: bdevio tests on: TestPT 00:07:24.284 Test: blockdev write read block ...passed 00:07:24.284 Test: blockdev write zeroes read block ...passed 00:07:24.284 Test: blockdev write zeroes read no split ...passed 00:07:24.284 Test: blockdev write zeroes read split ...passed 00:07:24.284 Test: blockdev write zeroes read split partial ...passed 00:07:24.284 Test: blockdev reset ...passed 00:07:24.284 Test: blockdev write read 8 blocks ...passed 00:07:24.284 Test: blockdev write read size > 128k ...passed 00:07:24.284 Test: blockdev write read invalid size ...passed 00:07:24.284 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.284 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.284 Test: blockdev write read max offset ...passed 00:07:24.284 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.284 Test: blockdev writev readv 8 blocks ...passed 00:07:24.284 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.284 Test: blockdev writev readv block ...passed 00:07:24.284 Test: blockdev writev readv size > 128k ...passed 00:07:24.544 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.544 Test: blockdev comparev and writev ...passed 00:07:24.544 Test: blockdev nvme passthru rw ...passed 00:07:24.544 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.544 Test: blockdev nvme admin passthru ...passed 00:07:24.544 Test: blockdev copy ...passed 00:07:24.544 Suite: bdevio tests on: Malloc2p7 00:07:24.544 Test: blockdev write read block ...passed 00:07:24.544 Test: blockdev write zeroes read block ...passed 00:07:24.544 Test: blockdev write zeroes read no split ...passed 00:07:24.544 Test: blockdev write zeroes read split ...passed 00:07:24.544 Test: blockdev write zeroes read split partial ...passed 00:07:24.544 Test: blockdev reset ...passed 00:07:24.544 Test: blockdev write read 8 blocks ...passed 00:07:24.544 Test: blockdev write read size > 128k ...passed 00:07:24.544 Test: blockdev write read invalid size ...passed 00:07:24.544 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.544 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.544 Test: blockdev write read max offset ...passed 00:07:24.544 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.544 Test: blockdev writev readv 8 blocks ...passed 00:07:24.544 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.544 Test: blockdev writev readv block ...passed 00:07:24.544 Test: blockdev writev readv size > 128k ...passed 00:07:24.544 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.544 Test: blockdev comparev and writev ...passed 00:07:24.544 Test: blockdev nvme passthru rw ...passed 00:07:24.544 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.544 Test: blockdev nvme admin passthru ...passed 00:07:24.544 Test: blockdev copy ...passed 00:07:24.544 Suite: bdevio tests on: Malloc2p6 00:07:24.544 Test: blockdev write read block ...passed 00:07:24.544 Test: blockdev write zeroes read block ...passed 00:07:24.544 Test: blockdev write zeroes read no split ...passed 00:07:24.544 Test: blockdev write zeroes read split ...passed 00:07:24.544 Test: blockdev write zeroes read split partial ...passed 00:07:24.544 Test: blockdev reset ...passed 00:07:24.544 Test: blockdev write read 8 blocks ...passed 00:07:24.544 Test: blockdev write read size > 128k ...passed 00:07:24.544 Test: blockdev write read invalid size ...passed 00:07:24.544 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.544 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.544 Test: blockdev write read max offset ...passed 00:07:24.544 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.544 Test: blockdev writev readv 8 blocks ...passed 00:07:24.544 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.544 Test: blockdev writev readv block ...passed 00:07:24.544 Test: blockdev writev readv size > 128k ...passed 00:07:24.544 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.544 Test: blockdev comparev and writev ...passed 00:07:24.544 Test: blockdev nvme passthru rw ...passed 00:07:24.544 Test: blockdev nvme passthru vendor 
specific ...passed 00:07:24.544 Test: blockdev nvme admin passthru ...passed 00:07:24.544 Test: blockdev copy ...passed 00:07:24.544 Suite: bdevio tests on: Malloc2p5 00:07:24.544 Test: blockdev write read block ...passed 00:07:24.544 Test: blockdev write zeroes read block ...passed 00:07:24.544 Test: blockdev write zeroes read no split ...passed 00:07:24.544 Test: blockdev write zeroes read split ...passed 00:07:24.544 Test: blockdev write zeroes read split partial ...passed 00:07:24.544 Test: blockdev reset ...passed 00:07:24.544 Test: blockdev write read 8 blocks ...passed 00:07:24.544 Test: blockdev write read size > 128k ...passed 00:07:24.544 Test: blockdev write read invalid size ...passed 00:07:24.544 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc2p4 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc2p3 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: 
blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc2p2 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc2p1 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 
Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc2p0 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: bdevio tests on: Malloc1p1 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.545 Test: blockdev reset ...passed 00:07:24.545 Test: blockdev write read 8 blocks ...passed 00:07:24.545 Test: blockdev write read size > 128k ...passed 00:07:24.545 Test: blockdev write read invalid size ...passed 00:07:24.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.545 Test: blockdev write read max offset ...passed 00:07:24.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.545 Test: blockdev writev readv 8 blocks ...passed 00:07:24.545 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.545 Test: blockdev writev readv block ...passed 00:07:24.545 Test: blockdev writev readv size > 128k ...passed 00:07:24.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.545 Test: blockdev comparev and writev ...passed 00:07:24.545 Test: blockdev nvme passthru rw ...passed 00:07:24.545 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.545 Test: blockdev nvme admin passthru ...passed 00:07:24.545 Test: blockdev copy ...passed 00:07:24.545 Suite: 
bdevio tests on: Malloc1p0 00:07:24.545 Test: blockdev write read block ...passed 00:07:24.545 Test: blockdev write zeroes read block ...passed 00:07:24.545 Test: blockdev write zeroes read no split ...passed 00:07:24.545 Test: blockdev write zeroes read split ...passed 00:07:24.545 Test: blockdev write zeroes read split partial ...passed 00:07:24.546 Test: blockdev reset ...passed 00:07:24.546 Test: blockdev write read 8 blocks ...passed 00:07:24.546 Test: blockdev write read size > 128k ...passed 00:07:24.546 Test: blockdev write read invalid size ...passed 00:07:24.546 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.546 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.546 Test: blockdev write read max offset ...passed 00:07:24.546 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.546 Test: blockdev writev readv 8 blocks ...passed 00:07:24.546 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.546 Test: blockdev writev readv block ...passed 00:07:24.546 Test: blockdev writev readv size > 128k ...passed 00:07:24.546 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.546 Test: blockdev comparev and writev ...passed 00:07:24.546 Test: blockdev nvme passthru rw ...passed 00:07:24.546 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.546 Test: blockdev nvme admin passthru ...passed 00:07:24.546 Test: blockdev copy ...passed 00:07:24.546 Suite: bdevio tests on: Malloc0 00:07:24.546 Test: blockdev write read block ...passed 00:07:24.546 Test: blockdev write zeroes read block ...passed 00:07:24.546 Test: blockdev write zeroes read no split ...passed 00:07:24.546 Test: blockdev write zeroes read split ...passed 00:07:24.546 Test: blockdev write zeroes read split partial ...passed 00:07:24.546 Test: blockdev reset ...passed 00:07:24.546 Test: blockdev write read 8 blocks ...passed 00:07:24.546 Test: blockdev write read size > 128k ...passed 00:07:24.546 Test: blockdev write read invalid size ...passed 00:07:24.546 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.546 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.546 Test: blockdev write read max offset ...passed 00:07:24.546 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.546 Test: blockdev writev readv 8 blocks ...passed 00:07:24.546 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.546 Test: blockdev writev readv block ...passed 00:07:24.546 Test: blockdev writev readv size > 128k ...passed 00:07:24.546 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.546 Test: blockdev comparev and writev ...passed 00:07:24.546 Test: blockdev nvme passthru rw ...passed 00:07:24.546 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.546 Test: blockdev nvme admin passthru ...passed 00:07:24.546 Test: blockdev copy ...passed 00:07:24.546 00:07:24.546 Run Summary: Type Total Ran Passed Failed Inactive 00:07:24.546 suites 16 16 n/a 0 0 00:07:24.546 tests 368 368 368 0 0 00:07:24.546 asserts 2224 2224 2224 0 n/a 00:07:24.546 00:07:24.546 Elapsed time = 0.464 seconds 00:07:24.546 0 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 2026298 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2026298 ']' 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2026298 
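The run summary totals above are consistent: each of the 16 bdevio suites runs the same 23 blockdev tests, and 16 x 23 = 368. The killprocess guard the trace walks through next boils down to the sketch below; it is paraphrased from the visible xtrace of common/autotest_common.sh, not the verbatim source, and the sudo special case is omitted:

    killprocess() {
        local pid=$1
        if [ -z "$pid" ]; then return 1; fi      # refuse an empty pid
        kill -0 "$pid" 2>/dev/null || return 0   # already gone, nothing to do
        if [ "$(uname)" = Linux ]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 in this run
            # the real helper special-cases process_name = sudo; omitted in this sketch
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }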
00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2026298 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2026298' 00:07:24.546 killing process with pid 2026298 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2026298 00:07:24.546 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2026298 00:07:24.806 18:44:09 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:07:24.806 00:07:24.806 real 0m1.426s 00:07:24.806 user 0m3.649s 00:07:24.806 sys 0m0.349s 00:07:24.806 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.806 18:44:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:24.806 ************************************ 00:07:24.806 END TEST bdev_bounds 00:07:24.806 ************************************ 00:07:24.806 18:44:09 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:24.806 18:44:09 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:07:24.806 18:44:09 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.806 18:44:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:24.806 ************************************ 00:07:24.806 START TEST bdev_nbd 00:07:24.806 ************************************ 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:07:24.806 18:44:09 
blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=2026648 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 2026648 /var/tmp/spdk-nbd.sock 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2026648 ']' 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:24.806 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:24.806 18:44:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:24.806 [2024-07-24 18:44:09.802100] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
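The nbd setup traced here amounts to: launch the bdev_svc app against the JSON bdev config with a private RPC socket, wait for that socket to answer, then export bdevs over NBD through it. A condensed sketch of the same flow, with $SPDK_DIR standing in for the long workspace path and waitforlisten being the autotest helper seen in the trace:

    rpc_sock=/var/tmp/spdk-nbd.sock
    conf="$SPDK_DIR/test/bdev/bdev.json"

    "$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$rpc_sock" -i 0 --json "$conf" &
    nbd_pid=$!
    waitforlisten "$nbd_pid" "$rpc_sock"     # blocks until the RPC socket accepts connections

    # export one bdev; with no device argument the RPC picks the next free /dev/nbdN
    nbd_device=$("$SPDK_DIR/scripts/rpc.py" -s "$rpc_sock" nbd_start_disk Malloc0)
    echo "$nbd_device"                       # /dev/nbd0 in this run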
00:07:24.806 [2024-07-24 18:44:09.802139] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:25.065 [2024-07-24 18:44:09.866221] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.065 [2024-07-24 18:44:09.937253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.322 [2024-07-24 18:44:10.077195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:25.322 [2024-07-24 18:44:10.077239] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:25.322 [2024-07-24 18:44:10.077247] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:25.322 [2024-07-24 18:44:10.085205] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:25.322 [2024-07-24 18:44:10.085225] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:25.322 [2024-07-24 18:44:10.093217] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:25.322 [2024-07-24 18:44:10.093235] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:25.322 [2024-07-24 18:44:10.160076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:25.322 [2024-07-24 18:44:10.160115] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:25.322 [2024-07-24 18:44:10.160123] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6e560 00:07:25.322 [2024-07-24 18:44:10.160129] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:25.322 [2024-07-24 18:44:10.161190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:25.322 [2024-07-24 18:44:10.161213] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:25.580 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:25.838 1+0 records in 00:07:25.838 1+0 records out 00:07:25.838 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235889 s, 17.4 MB/s 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:25.838 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:26.097 18:44:10 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.097 1+0 records in 00:07:26.097 1+0 records out 00:07:26.097 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222132 s, 18.4 MB/s 00:07:26.097 18:44:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.097 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:26.097 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.098 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:26.098 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:26.098 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.098 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.098 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.357 1+0 records in 00:07:26.357 1+0 records out 00:07:26.357 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021559 s, 19.0 MB/s 00:07:26.357 
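Each exported device is verified the same way in this trace: poll /proc/partitions until the nbd entry appears, then read one 4 KiB block with O_DIRECT into a scratch file and check that a non-zero number of bytes landed. A simplified sketch of the waitfornbd helper as it appears in the xtrace; the retry delay is assumed (the successful grep here never reaches it) and the real scratch path is the nbdtest file under the workspace:

    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                        # assumed retry delay between polls
        done
        # prove the device is readable: one 4 KiB direct read into a scratch file
        dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
        local size
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]                     # matches the "'[' 4096 '!=' 0 ']'" check in the log
    }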
18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.357 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.616 1+0 records in 00:07:26.616 1+0 records out 00:07:26.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000158702 s, 25.8 MB/s 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.616 1+0 records in 00:07:26.616 1+0 records out 00:07:26.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244947 s, 16.7 MB/s 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.616 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:26.875 1+0 records in 00:07:26.875 1+0 records out 00:07:26.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228864 s, 17.9 MB/s 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.875 18:44:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.134 1+0 records in 00:07:27.134 1+0 records out 00:07:27.134 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267087 s, 15.3 MB/s 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.134 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
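The same start-and-verify sequence repeats for all 16 bdevs in bdev_list, and the harness later dumps the resulting mapping with nbd_get_disks and jq (echoed further down in this log). Condensed from the repeated trace blocks, the driving loop is roughly:

    for bdev in "${bdev_list[@]}"; do        # Malloc0 ... AIO0, 16 bdevs
        nbd_device=$("$SPDK_DIR/scripts/rpc.py" -s "$rpc_sock" nbd_start_disk "$bdev")
        waitfornbd "$(basename "$nbd_device")"
    done

    # dump the bdev -> /dev/nbdN mapping and keep only the device nodes
    "$SPDK_DIR/scripts/rpc.py" -s "$rpc_sock" nbd_get_disks | jq -r '.[] | .nbd_device'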
00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.393 1+0 records in 00:07:27.393 1+0 records out 00:07:27.393 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312925 s, 13.1 MB/s 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:27.393 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.652 18:44:12 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.652 1+0 records in 00:07:27.652 1+0 records out 00:07:27.652 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290131 s, 14.1 MB/s 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.652 1+0 records in 00:07:27.652 1+0 records out 00:07:27.652 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288217 s, 14.2 MB/s 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.652 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.652 18:44:12 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.910 1+0 records in 00:07:27.910 1+0 records out 00:07:27.910 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331449 s, 12.4 MB/s 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.910 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.169 1+0 records in 00:07:28.169 1+0 records out 00:07:28.169 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319343 s, 12.8 MB/s 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.169 18:44:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.169 1+0 records in 00:07:28.169 1+0 records out 00:07:28.169 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002968 s, 13.8 MB/s 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:28.169 18:44:13 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.169 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.428 1+0 records in 00:07:28.428 1+0 records out 00:07:28.428 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307967 s, 13.3 MB/s 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.428 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.686 1+0 records in 00:07:28.686 1+0 records out 00:07:28.686 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00037481 s, 10.9 MB/s 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.686 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.944 1+0 records in 00:07:28.944 1+0 records out 00:07:28.944 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246865 s, 16.6 MB/s 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd0", 00:07:28.944 "bdev_name": "Malloc0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd1", 00:07:28.944 "bdev_name": "Malloc1p0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd2", 00:07:28.944 "bdev_name": "Malloc1p1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd3", 00:07:28.944 "bdev_name": "Malloc2p0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd4", 00:07:28.944 "bdev_name": "Malloc2p1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd5", 00:07:28.944 "bdev_name": "Malloc2p2" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd6", 00:07:28.944 "bdev_name": "Malloc2p3" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd7", 00:07:28.944 "bdev_name": "Malloc2p4" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd8", 00:07:28.944 "bdev_name": "Malloc2p5" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd9", 00:07:28.944 "bdev_name": "Malloc2p6" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd10", 00:07:28.944 "bdev_name": "Malloc2p7" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd11", 00:07:28.944 "bdev_name": "TestPT" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd12", 00:07:28.944 "bdev_name": "raid0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd13", 00:07:28.944 "bdev_name": "concat0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd14", 00:07:28.944 "bdev_name": "raid1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd15", 00:07:28.944 "bdev_name": "AIO0" 00:07:28.944 } 00:07:28.944 ]' 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd0", 00:07:28.944 "bdev_name": "Malloc0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd1", 00:07:28.944 "bdev_name": "Malloc1p0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd2", 00:07:28.944 "bdev_name": "Malloc1p1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd3", 00:07:28.944 "bdev_name": "Malloc2p0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd4", 00:07:28.944 "bdev_name": "Malloc2p1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd5", 00:07:28.944 "bdev_name": "Malloc2p2" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd6", 00:07:28.944 "bdev_name": "Malloc2p3" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd7", 00:07:28.944 "bdev_name": "Malloc2p4" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd8", 00:07:28.944 "bdev_name": "Malloc2p5" 
00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd9", 00:07:28.944 "bdev_name": "Malloc2p6" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd10", 00:07:28.944 "bdev_name": "Malloc2p7" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd11", 00:07:28.944 "bdev_name": "TestPT" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd12", 00:07:28.944 "bdev_name": "raid0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd13", 00:07:28.944 "bdev_name": "concat0" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd14", 00:07:28.944 "bdev_name": "raid1" 00:07:28.944 }, 00:07:28.944 { 00:07:28.944 "nbd_device": "/dev/nbd15", 00:07:28.944 "bdev_name": "AIO0" 00:07:28.944 } 00:07:28.944 ]' 00:07:28.944 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:29.202 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:29.202 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.203 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:29.203 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.203 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:29.203 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.203 18:44:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.203 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.461 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.462 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.462 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.719 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:07:29.977 18:44:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.235 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.493 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.786 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.045 18:44:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.303 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.562 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.821 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:32.080 18:44:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:32.338 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:32.339 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:32.339 /dev/nbd0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:32.621 1+0 records in 00:07:32.621 1+0 records out 00:07:32.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024361 s, 16.8 MB/s 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:32.621 /dev/nbd1 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:32.621 1+0 records in 00:07:32.621 1+0 records out 00:07:32.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000148222 s, 27.6 MB/s 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:32.621 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:32.622 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:32.880 /dev/nbd10 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:32.880 1+0 records in 00:07:32.880 1+0 records out 00:07:32.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252207 s, 16.2 MB/s 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:32.880 18:44:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:33.139 /dev/nbd11 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.139 1+0 records in 00:07:33.139 1+0 records out 00:07:33.139 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228038 s, 18.0 MB/s 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.139 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:33.398 /dev/nbd12 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.398 1+0 records in 00:07:33.398 1+0 records out 00:07:33.398 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230719 s, 17.8 MB/s 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.398 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:33.398 /dev/nbd13 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:33.657 18:44:18 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.657 1+0 records in 00:07:33.657 1+0 records out 00:07:33.657 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257147 s, 15.9 MB/s 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:33.657 /dev/nbd14 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.657 1+0 records in 00:07:33.657 1+0 records out 00:07:33.657 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188991 s, 21.7 MB/s 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.657 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:33.916 /dev/nbd15 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.916 1+0 records in 00:07:33.916 1+0 records out 00:07:33.916 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253749 s, 16.1 MB/s 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.916 18:44:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:34.174 /dev/nbd2 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.174 18:44:19 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.174 1+0 records in 00:07:34.174 1+0 records out 00:07:34.174 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247919 s, 16.5 MB/s 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.174 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:34.433 /dev/nbd3 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.433 1+0 records in 00:07:34.433 1+0 records out 00:07:34.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315049 s, 13.0 MB/s 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.433 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:07:34.692 /dev/nbd4 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.692 1+0 records in 00:07:34.692 1+0 records out 00:07:34.692 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293683 s, 13.9 MB/s 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:34.692 /dev/nbd5 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.692 18:44:19 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.692 1+0 records in 00:07:34.692 1+0 records out 00:07:34.692 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246912 s, 16.6 MB/s 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.692 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:34.951 /dev/nbd6 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.951 1+0 records in 00:07:34.951 1+0 records out 00:07:34.951 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000350387 s, 11.7 MB/s 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.951 18:44:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:35.210 /dev/nbd7 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.210 1+0 records in 00:07:35.210 1+0 records out 00:07:35.210 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257391 s, 15.9 MB/s 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.210 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:35.468 /dev/nbd8 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.468 1+0 records in 00:07:35.468 1+0 records out 00:07:35.468 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352159 s, 11.6 MB/s 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.468 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:35.727 /dev/nbd9 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.727 1+0 records in 00:07:35.727 1+0 records out 00:07:35.727 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000383455 s, 10.7 MB/s 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.727 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd0", 00:07:35.985 "bdev_name": "Malloc0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd1", 00:07:35.985 "bdev_name": "Malloc1p0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd10", 00:07:35.985 "bdev_name": "Malloc1p1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd11", 00:07:35.985 "bdev_name": "Malloc2p0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd12", 00:07:35.985 "bdev_name": "Malloc2p1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd13", 00:07:35.985 "bdev_name": "Malloc2p2" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd14", 00:07:35.985 "bdev_name": "Malloc2p3" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd15", 00:07:35.985 "bdev_name": "Malloc2p4" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd2", 00:07:35.985 "bdev_name": "Malloc2p5" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd3", 00:07:35.985 "bdev_name": "Malloc2p6" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd4", 00:07:35.985 "bdev_name": "Malloc2p7" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd5", 00:07:35.985 "bdev_name": "TestPT" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd6", 00:07:35.985 "bdev_name": "raid0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd7", 00:07:35.985 "bdev_name": "concat0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd8", 00:07:35.985 "bdev_name": "raid1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd9", 00:07:35.985 "bdev_name": "AIO0" 00:07:35.985 } 00:07:35.985 ]' 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd0", 00:07:35.985 "bdev_name": "Malloc0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd1", 00:07:35.985 "bdev_name": "Malloc1p0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd10", 00:07:35.985 "bdev_name": "Malloc1p1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd11", 00:07:35.985 "bdev_name": "Malloc2p0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd12", 00:07:35.985 "bdev_name": "Malloc2p1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd13", 00:07:35.985 "bdev_name": "Malloc2p2" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd14", 00:07:35.985 "bdev_name": "Malloc2p3" 00:07:35.985 }, 
00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd15", 00:07:35.985 "bdev_name": "Malloc2p4" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd2", 00:07:35.985 "bdev_name": "Malloc2p5" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd3", 00:07:35.985 "bdev_name": "Malloc2p6" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd4", 00:07:35.985 "bdev_name": "Malloc2p7" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd5", 00:07:35.985 "bdev_name": "TestPT" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd6", 00:07:35.985 "bdev_name": "raid0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd7", 00:07:35.985 "bdev_name": "concat0" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd8", 00:07:35.985 "bdev_name": "raid1" 00:07:35.985 }, 00:07:35.985 { 00:07:35.985 "nbd_device": "/dev/nbd9", 00:07:35.985 "bdev_name": "AIO0" 00:07:35.985 } 00:07:35.985 ]' 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:35.985 /dev/nbd1 00:07:35.985 /dev/nbd10 00:07:35.985 /dev/nbd11 00:07:35.985 /dev/nbd12 00:07:35.985 /dev/nbd13 00:07:35.985 /dev/nbd14 00:07:35.985 /dev/nbd15 00:07:35.985 /dev/nbd2 00:07:35.985 /dev/nbd3 00:07:35.985 /dev/nbd4 00:07:35.985 /dev/nbd5 00:07:35.985 /dev/nbd6 00:07:35.985 /dev/nbd7 00:07:35.985 /dev/nbd8 00:07:35.985 /dev/nbd9' 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:35.985 /dev/nbd1 00:07:35.985 /dev/nbd10 00:07:35.985 /dev/nbd11 00:07:35.985 /dev/nbd12 00:07:35.985 /dev/nbd13 00:07:35.985 /dev/nbd14 00:07:35.985 /dev/nbd15 00:07:35.985 /dev/nbd2 00:07:35.985 /dev/nbd3 00:07:35.985 /dev/nbd4 00:07:35.985 /dev/nbd5 00:07:35.985 /dev/nbd6 00:07:35.985 /dev/nbd7 00:07:35.985 /dev/nbd8 00:07:35.985 /dev/nbd9' 00:07:35.985 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:35.986 256+0 records in 00:07:35.986 256+0 records out 00:07:35.986 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103122 s, 102 MB/s 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:35.986 256+0 records in 00:07:35.986 256+0 records out 00:07:35.986 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0626119 s, 16.7 MB/s 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:35.986 256+0 records in 00:07:35.986 256+0 records out 00:07:35.986 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0642252 s, 16.3 MB/s 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.986 18:44:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:36.244 256+0 records in 00:07:36.244 256+0 records out 00:07:36.244 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.064794 s, 16.2 MB/s 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:36.244 256+0 records in 00:07:36.244 256+0 records out 00:07:36.244 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0644732 s, 16.3 MB/s 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:36.244 256+0 records in 00:07:36.244 256+0 records out 00:07:36.244 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0631721 s, 16.6 MB/s 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:36.244 256+0 records in 00:07:36.244 256+0 records out 00:07:36.244 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0636596 s, 16.5 MB/s 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.244 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:36.501 256+0 records in 00:07:36.501 256+0 records out 00:07:36.501 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0645868 s, 16.2 MB/s 00:07:36.501 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.501 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:36.501 256+0 records in 00:07:36.501 256+0 records out 00:07:36.501 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0639793 s, 16.4 MB/s 00:07:36.501 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.501 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:36.501 256+0 records in 00:07:36.501 256+0 records out 00:07:36.501 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0635327 s, 16.5 MB/s 00:07:36.501 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.502 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:36.502 256+0 records in 00:07:36.502 256+0 records out 00:07:36.502 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0644275 s, 16.3 MB/s 00:07:36.502 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.502 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:36.760 256+0 records in 00:07:36.760 256+0 records out 00:07:36.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0647987 s, 16.2 MB/s 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:07:36.760 256+0 records in 00:07:36.760 256+0 records out 00:07:36.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0645163 s, 16.3 MB/s 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:36.760 256+0 records in 00:07:36.760 256+0 records out 00:07:36.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0648429 s, 16.2 MB/s 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:36.760 256+0 records in 00:07:36.760 256+0 records out 00:07:36.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0649388 s, 16.1 MB/s 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.760 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:37.018 256+0 records in 00:07:37.018 256+0 records out 00:07:37.018 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0662282 s, 15.8 MB/s 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:37.018 256+0 records in 00:07:37.018 256+0 records out 00:07:37.018 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0628713 s, 16.7 MB/s 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:37.018 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.019 18:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.277 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.535 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.536 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:37.794 18:44:22 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.794 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.052 18:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.311 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:38.578 18:44:23 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.578 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.840 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.098 18:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:39.098 18:44:24 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.357 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.616 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.875 18:44:24 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:39.875 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.134 18:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.134 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:40.392 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:40.392 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:40.392 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:40.393 18:44:25 
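
For readers reconstructing the nbd flow from the trace above, the write/verify/stop cycle reduces to roughly the following shell pattern. This is a minimal sketch, not the test script itself: the rpc.py socket, the dd/cmp arguments and the /dev/nbd*-to-bdev mapping are taken from the trace, while wait_for_nbd and wait_for_nbd_exit are illustrative stand-ins for the waitfornbd/waitfornbd_exit helpers, and the traced run starts all 16 devices before writing, verifying and stopping them rather than cycling per device.

    #!/usr/bin/env bash
    # Sketch of the nbd data-verify cycle seen in the trace above.
    set -euo pipefail

    RPC=./scripts/rpc.py                 # adjust to the SPDK checkout location
    SOCK=/var/tmp/spdk-nbd.sock
    TMP=/tmp/nbdrandtest

    wait_for_nbd() {                     # device shows up in /proc/partitions
        local name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions && return 0
            sleep 0.1
        done
        return 1
    }

    wait_for_nbd_exit() {                # device disappears after nbd_stop_disk
        local name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions || return 0
            sleep 0.1
        done
        return 1
    }

    # 1 MiB of random data, replayed to each exported device and compared back.
    dd if=/dev/urandom of="$TMP" bs=4096 count=256

    declare -A BDEV=( [/dev/nbd0]=Malloc0 [/dev/nbd1]=Malloc1p0 [/dev/nbd9]=AIO0 )
    for dev in "${!BDEV[@]}"; do         # subset of the 16 devices in the trace
        "$RPC" -s "$SOCK" nbd_start_disk "${BDEV[$dev]}" "$dev"
        wait_for_nbd "$(basename "$dev")"
        dd if="$TMP" of="$dev" bs=4096 count=256 oflag=direct
        cmp -b -n 1M "$TMP" "$dev"       # verify what was written
        "$RPC" -s "$SOCK" nbd_stop_disk "$dev"
        wait_for_nbd_exit "$(basename "$dev")"
    done
    rm -f "$TMP"
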
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:40.393 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:40.651 malloc_lvol_verify 00:07:40.651 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:40.651 f7757103-74df-4e53-a501-bc2a01c949ee 00:07:40.651 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:40.911 8ff53c4f-0f5f-4b53-a434-b55cc5b2f693 00:07:40.911 18:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:41.171 /dev/nbd0 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:41.171 mke2fs 1.46.5 (30-Dec-2021) 00:07:41.171 Discarding device blocks: 0/4096 done 00:07:41.171 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:41.171 00:07:41.171 Allocating group tables: 0/1 done 00:07:41.171 Writing inode tables: 0/1 done 00:07:41.171 Creating journal (1024 blocks): done 00:07:41.171 Writing superblocks and filesystem accounting information: 0/1 done 00:07:41.171 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:41.171 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.171 18:44:26 
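
The lvol verification step just above boils down to a handful of RPCs plus mkfs: a 16 MiB malloc bdev hosts an lvolstore, a 4 MiB lvol from it is exported over /dev/nbd0, and mkfs.ext4 completing on that device is the pass criterion. A sketch of that sequence, assuming the same RPC socket and sizes shown in the trace and omitting the retry/error handling of the real helpers:

    # lvol-backed nbd smoke test, as traced above (requires root and the nbd module).
    RPC=(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock)

    "${RPC[@]}" bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB bdev, 512 B blocks
    "${RPC[@]}" bdev_lvol_create_lvstore malloc_lvol_verify lvs
    "${RPC[@]}" bdev_lvol_create lvol 4 -l lvs                    # 4 MiB logical volume
    "${RPC[@]}" nbd_start_disk lvs/lvol /dev/nbd0

    mkfs.ext4 /dev/nbd0 && echo "lvol verify passed"

    "${RPC[@]}" nbd_stop_disk /dev/nbd0
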
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 2026648 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2026648 ']' 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2026648 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2026648 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2026648' 00:07:41.430 killing process with pid 2026648 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2026648 00:07:41.430 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2026648 00:07:41.690 18:44:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:07:41.690 00:07:41.690 real 0m16.794s 00:07:41.690 user 0m22.582s 00:07:41.690 sys 0m8.065s 00:07:41.690 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.690 18:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:41.690 ************************************ 00:07:41.690 END TEST bdev_nbd 00:07:41.690 ************************************ 00:07:41.690 18:44:26 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:07:41.690 18:44:26 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:07:41.690 18:44:26 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:07:41.690 18:44:26 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:07:41.690 18:44:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:41.690 18:44:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.690 18:44:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:41.690 ************************************ 00:07:41.690 START TEST 
bdev_fio 00:07:41.690 ************************************ 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:41.690 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:41.690 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=verify 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type=AIO 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z verify ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1311 -- # '[' verify == verify ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # cat 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1321 -- # '[' AIO == AIO ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # /usr/src/fio/fio --version 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # echo serialize_overlap=1 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio 
-- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.691 18:44:26 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:41.691 ************************************ 00:07:41.691 START TEST bdev_fio_rw_verify 00:07:41.691 ************************************ 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local sanitizers 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # shift 
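
The bdev.fio job file driving the run below is assembled one [job_<bdev>] section at a time, as the echo lines above show, and then handed to fio with the spdk_bdev ioengine preloaded. A rough equivalent of that generation plus the invocation follows; the [global] options here are illustrative placeholders rather than the exact contents of fio_config_gen's template, only two of the 16 bdevs are listed, and the paths assume an SPDK checkout in the current directory:

    BDEV_FIO=/tmp/bdev.fio
    PLUGIN=./build/fio/spdk_bdev          # fio plugin built with SPDK

    cat > "$BDEV_FIO" <<'EOF'
    [global]
    thread=1
    rw=randwrite
    verify=md5
    serialize_overlap=1
    EOF

    for b in Malloc0 Malloc1p0; do        # one [job_<bdev>] section per bdev
        printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$BDEV_FIO"
    done

    LD_PRELOAD="$PLUGIN" /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        --verify_state_save=0 --spdk_mem=0 \
        --spdk_json_conf=./test/bdev/bdev.json \
        "$BDEV_FIO"
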
00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # local asan_lib= 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libasan 00:07:41.691 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:41.979 18:44:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:42.244 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 
job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.244 fio-3.35 00:07:42.244 Starting 16 threads 00:07:54.451 00:07:54.451 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2030314: Wed Jul 24 18:44:37 2024 00:07:54.451 read: IOPS=107k, BW=417MiB/s (437MB/s)(4167MiB/10001msec) 00:07:54.451 slat (nsec): min=1962, max=215247, avg=30723.19, stdev=13546.39 00:07:54.451 clat (usec): min=9, max=1376, avg=254.48, stdev=123.37 00:07:54.451 lat (usec): min=14, max=1461, avg=285.21, stdev=130.71 00:07:54.451 clat percentiles (usec): 00:07:54.451 | 50.000th=[ 247], 99.000th=[ 519], 99.900th=[ 619], 99.990th=[ 881], 00:07:54.451 | 99.999th=[ 1004] 00:07:54.451 write: IOPS=166k, BW=649MiB/s (681MB/s)(6410MiB/9870msec); 0 zone resets 00:07:54.451 slat (usec): min=4, max=541, avg=40.72, stdev=13.27 00:07:54.451 clat (usec): min=10, max=3625, avg=294.06, stdev=136.52 00:07:54.451 lat (usec): min=28, max=3663, avg=334.79, stdev=143.29 00:07:54.451 clat percentiles (usec): 00:07:54.451 | 50.000th=[ 281], 99.000th=[ 619], 99.900th=[ 816], 99.990th=[ 979], 00:07:54.451 | 99.999th=[ 1319] 00:07:54.451 bw ( KiB/s): min=533808, max=924403, per=99.04%, avg=658672.16, stdev=6256.89, samples=304 00:07:54.451 iops : min=133452, max=231102, avg=164668.11, stdev=1564.21, samples=304 00:07:54.451 lat (usec) : 10=0.01%, 20=0.03%, 50=1.23%, 100=7.11%, 250=37.31% 00:07:54.451 lat (usec) : 500=49.26%, 750=4.87%, 1000=0.18% 00:07:54.451 lat (msec) : 2=0.01%, 4=0.01% 00:07:54.451 cpu : usr=99.34%, sys=0.31%, ctx=669, majf=0, minf=2085 00:07:54.451 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:54.451 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:54.451 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:54.451 issued rwts: total=1066742,1641067,0,0 short=0,0,0,0 dropped=0,0,0,0 00:07:54.451 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:54.451 00:07:54.452 Run status group 0 (all jobs): 00:07:54.452 READ: bw=417MiB/s (437MB/s), 417MiB/s-417MiB/s (437MB/s-437MB/s), io=4167MiB (4369MB), run=10001-10001msec 00:07:54.452 WRITE: bw=649MiB/s (681MB/s), 649MiB/s-649MiB/s (681MB/s-681MB/s), io=6410MiB (6722MB), run=9870-9870msec 00:07:54.452 00:07:54.452 real 0m11.342s 00:07:54.452 user 2m47.944s 00:07:54.452 sys 0m1.298s 00:07:54.452 18:44:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.452 18:44:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:07:54.452 ************************************ 00:07:54.452 END TEST bdev_fio_rw_verify 00:07:54.452 
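The rw-verify summary above is internally consistent: with the fixed 4 KiB block size, IOPS is simply bandwidth divided by the I/O size, and the per-run totals match bandwidth times runtime. A quick cross-check in shell arithmetic:

  echo $(( 417 * 1024 / 4 ))      # ~106752 -> matches read  "IOPS=107k, BW=417MiB/s"
  echo $(( 649 * 1024 / 4 ))      # ~166144 -> matches write "IOPS=166k, BW=649MiB/s"
  echo $(( 4167 * 1000 / 10001 )) # ~416 MiB/s read over the 10001 ms run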
************************************ 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=trim 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type= 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z trim ']' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1311 -- # '[' trim == verify ']' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1326 -- # '[' trim == trim ']' 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # echo rw=trimwrite 00:07:54.452 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:54.453 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "65959268-e9fe-4e98-8c1c-1a6aae5df474"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "65959268-e9fe-4e98-8c1c-1a6aae5df474",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "a1b62a94-cd9e-52a1-ad91-152da23f9c1c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a1b62a94-cd9e-52a1-ad91-152da23f9c1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "019bba84-cc80-5697-9225-636efacd0f43"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "019bba84-cc80-5697-9225-636efacd0f43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "45b55701-cd11-546b-a237-8f3076f23cc5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "45b55701-cd11-546b-a237-8f3076f23cc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "df84e28a-4455-54b2-b33a-e389383719b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df84e28a-4455-54b2-b33a-e389383719b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "f0a2cf31-7c77-56e8-b197-d236c1aac61c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f0a2cf31-7c77-56e8-b197-d236c1aac61c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "0bfc745a-20b9-5b20-81ff-2184a98818a3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0bfc745a-20b9-5b20-81ff-2184a98818a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f64382ad-312e-5db6-8c1b-e56caf7476a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f64382ad-312e-5db6-8c1b-e56caf7476a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' 
'{' ' "name": "raid0",' ' "aliases": [' ' "2c5fb105-88a0-4285-9c75-931f827eb641"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2fc39124-b0a6-4e81-98ee-2956c31a77e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0f2e17c0-c340-48a1-9a90-131b8b95b882",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bc1c615f-ef65-41e1-8290-0896f4df63d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fa43c441-8514-47a1-b156-b10e7c30b132",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "1c20925f-01a9-4241-b17b-ccd7493d0c83",' ' 
"is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "24014bff-3a03-4bfc-a610-465675b93ba6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "4e481d73-5536-4380-81dc-96b7d5513589",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "bf76b770-0e0b-4c4d-9098-c0cf7cdab04b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5f1a9707-a285-4b5b-94fb-dbbd071e59fd"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5f1a9707-a285-4b5b-94fb-dbbd071e59fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:54.453 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:07:54.453 Malloc1p0 00:07:54.453 Malloc1p1 00:07:54.453 Malloc2p0 00:07:54.453 Malloc2p1 00:07:54.453 Malloc2p2 00:07:54.453 Malloc2p3 00:07:54.453 Malloc2p4 00:07:54.453 Malloc2p5 00:07:54.453 Malloc2p6 00:07:54.453 Malloc2p7 00:07:54.453 TestPT 00:07:54.453 raid0 00:07:54.453 concat0 ]] 00:07:54.453 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:54.455 18:44:38 blockdev_general.bdev_fio 
-- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "65959268-e9fe-4e98-8c1c-1a6aae5df474"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "65959268-e9fe-4e98-8c1c-1a6aae5df474",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dacdc9d4-eca7-5508-95c3-c1cb59aa8a8c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "a1b62a94-cd9e-52a1-ad91-152da23f9c1c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a1b62a94-cd9e-52a1-ad91-152da23f9c1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "019bba84-cc80-5697-9225-636efacd0f43"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "019bba84-cc80-5697-9225-636efacd0f43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "45b55701-cd11-546b-a237-8f3076f23cc5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "45b55701-cd11-546b-a237-8f3076f23cc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8bd9c9a-b571-559d-b2e1-f6245bdf3ad4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "df84e28a-4455-54b2-b33a-e389383719b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df84e28a-4455-54b2-b33a-e389383719b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b"' ' ],' ' "product_name": "Split Disk",' ' 
"block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e0086ef8-b7a6-52b8-a646-8f22b4cbcb7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9c7cdac-297d-534e-81f0-9ffdeceb1dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "f0a2cf31-7c77-56e8-b197-d236c1aac61c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f0a2cf31-7c77-56e8-b197-d236c1aac61c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "0bfc745a-20b9-5b20-81ff-2184a98818a3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0bfc745a-20b9-5b20-81ff-2184a98818a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f64382ad-312e-5db6-8c1b-e56caf7476a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f64382ad-312e-5db6-8c1b-e56caf7476a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "2c5fb105-88a0-4285-9c75-931f827eb641"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2c5fb105-88a0-4285-9c75-931f827eb641",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2fc39124-b0a6-4e81-98ee-2956c31a77e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0f2e17c0-c340-48a1-9a90-131b8b95b882",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bc1c615f-ef65-41e1-8290-0896f4df63d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bc1c615f-ef65-41e1-8290-0896f4df63d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fa43c441-8514-47a1-b156-b10e7c30b132",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "1c20925f-01a9-4241-b17b-ccd7493d0c83",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "24014bff-3a03-4bfc-a610-465675b93ba6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24014bff-3a03-4bfc-a610-465675b93ba6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "4e481d73-5536-4380-81dc-96b7d5513589",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "bf76b770-0e0b-4c4d-9098-c0cf7cdab04b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5f1a9707-a285-4b5b-94fb-dbbd071e59fd"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5f1a9707-a285-4b5b-94fb-dbbd071e59fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' 
"${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.455 18:44:38 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:54.455 ************************************ 00:07:54.455 START TEST bdev_fio_trim 00:07:54.455 ************************************ 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local sanitizers 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # shift 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # local asan_lib= 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libasan 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:54.455 18:44:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:54.455 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:54.455 fio-3.35 00:07:54.455 Starting 14 threads 00:08:04.466 00:08:04.466 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2032288: Wed Jul 24 18:44:49 2024 00:08:04.466 write: IOPS=149k, BW=584MiB/s (612MB/s)(5836MiB/10001msec); 0 zone resets 00:08:04.466 slat (nsec): min=1855, max=194759, avg=33254.18, stdev=10757.46 00:08:04.466 clat (usec): min=17, max=2937, avg=234.88, stdev=90.74 00:08:04.466 lat (usec): min=23, max=2957, avg=268.13, stdev=95.72 00:08:04.466 clat percentiles (usec): 00:08:04.466 | 50.000th=[ 223], 99.000th=[ 457], 99.900th=[ 506], 99.990th=[ 570], 00:08:04.466 | 99.999th=[ 988] 00:08:04.466 bw ( KiB/s): min=501088, max=870464, per=100.00%, avg=600551.00, stdev=8039.00, samples=266 00:08:04.466 iops : min=125272, max=217616, avg=150137.68, stdev=2009.74, samples=266 00:08:04.466 trim: IOPS=149k, BW=584MiB/s (612MB/s)(5836MiB/10001msec); 0 zone resets 00:08:04.466 slat (usec): min=3, max=858, avg=22.87, stdev= 7.01 00:08:04.466 clat (usec): min=3, max=2957, avg=263.44, stdev=98.91 00:08:04.466 lat (usec): min=8, max=2969, avg=286.31, stdev=102.55 00:08:04.466 clat percentiles (usec): 00:08:04.466 | 50.000th=[ 251], 99.000th=[ 498], 99.900th=[ 553], 99.990th=[ 594], 00:08:04.466 | 99.999th=[ 775] 00:08:04.466 bw ( KiB/s): min=501096, max=870520, per=100.00%, avg=600551.42, stdev=8039.05, samples=266 00:08:04.466 iops : min=125274, max=217630, avg=150137.79, stdev=2009.75, samples=266 00:08:04.466 lat (usec) : 4=0.01%, 10=0.02%, 20=0.05%, 50=0.20%, 100=2.80% 00:08:04.466 lat 
(usec) : 250=52.22%, 500=44.17%, 750=0.54%, 1000=0.01% 00:08:04.466 lat (msec) : 2=0.01%, 4=0.01% 00:08:04.466 cpu : usr=99.64%, sys=0.00%, ctx=554, majf=0, minf=1057 00:08:04.466 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:04.466 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:04.466 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:04.466 issued rwts: total=0,1494014,1494018,0 short=0,0,0,0 dropped=0,0,0,0 00:08:04.466 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:04.466 00:08:04.466 Run status group 0 (all jobs): 00:08:04.466 WRITE: bw=584MiB/s (612MB/s), 584MiB/s-584MiB/s (612MB/s-612MB/s), io=5836MiB (6119MB), run=10001-10001msec 00:08:04.466 TRIM: bw=584MiB/s (612MB/s), 584MiB/s-584MiB/s (612MB/s-612MB/s), io=5836MiB (6119MB), run=10001-10001msec 00:08:04.725 00:08:04.725 real 0m11.478s 00:08:04.725 user 2m28.001s 00:08:04.725 sys 0m0.495s 00:08:04.725 18:44:49 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:04.725 18:44:49 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:08:04.725 ************************************ 00:08:04.725 END TEST bdev_fio_trim 00:08:04.725 ************************************ 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:08:04.725 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:04.725 00:08:04.725 real 0m23.109s 00:08:04.725 user 5m16.127s 00:08:04.725 sys 0m1.927s 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:04.725 18:44:49 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:04.725 ************************************ 00:08:04.725 END TEST bdev_fio 00:08:04.725 ************************************ 00:08:04.983 18:44:49 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:04.983 18:44:49 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:04.983 18:44:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:08:04.983 18:44:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:04.983 18:44:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:04.983 ************************************ 00:08:04.983 START TEST bdev_verify 00:08:04.983 ************************************ 00:08:04.983 18:44:49 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:04.983 [2024-07-24 18:44:49.829324] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:08:04.983 [2024-07-24 18:44:49.829364] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034146 ] 00:08:04.983 [2024-07-24 18:44:49.894933] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:04.983 [2024-07-24 18:44:49.968672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:04.983 [2024-07-24 18:44:49.968675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.242 [2024-07-24 18:44:50.108023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:05.242 [2024-07-24 18:44:50.108065] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:05.242 [2024-07-24 18:44:50.108073] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:05.242 [2024-07-24 18:44:50.116034] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:05.242 [2024-07-24 18:44:50.116050] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:05.242 [2024-07-24 18:44:50.124050] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:05.242 [2024-07-24 18:44:50.124065] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:05.242 [2024-07-24 18:44:50.191447] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:05.242 [2024-07-24 18:44:50.191489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:05.242 [2024-07-24 18:44:50.191499] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e4fd0 00:08:05.242 [2024-07-24 18:44:50.191505] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:05.242 [2024-07-24 18:44:50.192522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:05.242 [2024-07-24 18:44:50.192543] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:05.500 Running I/O for 5 seconds... 
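The passthru notices above show the vbdev layering these stages run on: TestPT is a passthru bdev registered on top of Malloc3 once the base bdev arrives ("vbdev creation deferred pending base bdev arrival"), while the "Currently unable to find bdev" notices are emitted while dependent vbdevs wait for their bases. One plausible way to build the same layering by hand, assuming the standard rpc.py helpers (the exact flags and sizes are assumptions, the bdev names come from the log):

# requires a running SPDK target listening on /var/tmp/spdk.sock
./scripts/rpc.py bdev_malloc_create -b Malloc3 128 512      # base bdev; size and block size assumed
./scripts/rpc.py bdev_passthru_create -b Malloc3 -p TestPT  # passthru vbdev on top of Malloc3
./scripts/rpc.py save_config > test/bdev/bdev.json          # capture the layout for bdevperf/fio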
00:08:10.766 00:08:10.766 Latency(us) 00:08:10.766 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:10.766 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x1000 00:08:10.766 Malloc0 : 5.05 1621.07 6.33 0.00 0.00 78830.53 374.49 291603.99 00:08:10.766 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x1000 length 0x1000 00:08:10.766 Malloc0 : 5.05 1597.43 6.24 0.00 0.00 79992.79 419.35 325557.88 00:08:10.766 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x800 00:08:10.766 Malloc1p0 : 5.19 838.00 3.27 0.00 0.00 152113.63 2683.86 171766.74 00:08:10.766 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x800 length 0x800 00:08:10.766 Malloc1p0 : 5.19 838.42 3.28 0.00 0.00 152062.34 2699.46 171766.74 00:08:10.766 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x800 00:08:10.766 Malloc1p1 : 5.20 837.69 3.27 0.00 0.00 151839.51 2839.89 166773.52 00:08:10.766 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x800 length 0x800 00:08:10.766 Malloc1p1 : 5.19 838.17 3.27 0.00 0.00 151749.32 2793.08 167772.16 00:08:10.766 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p0 : 5.20 837.39 3.27 0.00 0.00 151541.50 2683.86 160781.65 00:08:10.766 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x200 length 0x200 00:08:10.766 Malloc2p0 : 5.19 837.93 3.27 0.00 0.00 151451.42 2683.86 161780.30 00:08:10.766 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p1 : 5.20 837.08 3.27 0.00 0.00 151265.30 2637.04 155788.43 00:08:10.766 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x200 length 0x200 00:08:10.766 Malloc2p1 : 5.20 837.62 3.27 0.00 0.00 151176.56 2605.84 155788.43 00:08:10.766 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p2 : 5.20 836.79 3.27 0.00 0.00 151003.56 2637.04 150795.22 00:08:10.766 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x200 length 0x200 00:08:10.766 Malloc2p2 : 5.20 837.31 3.27 0.00 0.00 150916.51 2637.04 151793.86 00:08:10.766 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p3 : 5.20 836.48 3.27 0.00 0.00 150724.34 2746.27 146800.64 00:08:10.766 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x200 length 0x200 00:08:10.766 Malloc2p3 : 5.20 837.01 3.27 0.00 0.00 150637.26 2746.27 147799.28 00:08:10.766 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p4 : 5.20 836.18 3.27 0.00 0.00 150458.81 2683.86 
142806.06 00:08:10.766 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x200 length 0x200 00:08:10.766 Malloc2p4 : 5.20 836.71 3.27 0.00 0.00 150369.41 2715.06 143804.71 00:08:10.766 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.766 Verification LBA range: start 0x0 length 0x200 00:08:10.766 Malloc2p5 : 5.21 835.89 3.27 0.00 0.00 150202.18 2683.86 140808.78 00:08:10.767 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x200 length 0x200 00:08:10.767 Malloc2p5 : 5.20 836.41 3.27 0.00 0.00 150108.51 2730.67 141807.42 00:08:10.767 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x200 00:08:10.767 Malloc2p6 : 5.21 835.59 3.26 0.00 0.00 149954.60 2793.08 138811.49 00:08:10.767 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x200 length 0x200 00:08:10.767 Malloc2p6 : 5.21 836.11 3.27 0.00 0.00 149858.41 2746.27 139810.13 00:08:10.767 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x200 00:08:10.767 Malloc2p7 : 5.21 835.29 3.26 0.00 0.00 149684.83 2730.67 133818.27 00:08:10.767 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x200 length 0x200 00:08:10.767 Malloc2p7 : 5.21 835.81 3.26 0.00 0.00 149584.04 2715.06 133818.27 00:08:10.767 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x1000 00:08:10.767 TestPT : 5.21 813.02 3.18 0.00 0.00 152604.07 15603.81 132819.63 00:08:10.767 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x1000 length 0x1000 00:08:10.767 TestPT : 5.22 811.76 3.17 0.00 0.00 153365.92 9362.29 185747.75 00:08:10.767 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x2000 00:08:10.767 raid0 : 5.21 834.61 3.26 0.00 0.00 148963.96 2699.46 113845.39 00:08:10.767 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x2000 length 0x2000 00:08:10.767 raid0 : 5.21 835.29 3.26 0.00 0.00 148838.94 2730.67 110350.14 00:08:10.767 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x2000 00:08:10.767 concat0 : 5.22 834.35 3.26 0.00 0.00 148684.81 2715.06 109850.82 00:08:10.767 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x2000 length 0x2000 00:08:10.767 concat0 : 5.21 834.80 3.26 0.00 0.00 148586.09 2699.46 106355.57 00:08:10.767 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 length 0x1000 00:08:10.767 raid1 : 5.22 833.98 3.26 0.00 0.00 148379.71 3198.78 112846.75 00:08:10.767 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x1000 length 0x1000 00:08:10.767 raid1 : 5.21 834.52 3.26 0.00 0.00 148280.22 3183.18 110849.46 00:08:10.767 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x0 
length 0x4e2 00:08:10.767 AIO0 : 5.22 833.57 3.26 0.00 0.00 148091.01 1349.73 116841.33 00:08:10.767 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:10.767 Verification LBA range: start 0x4e2 length 0x4e2 00:08:10.767 AIO0 : 5.22 834.31 3.26 0.00 0.00 147944.99 1326.32 115842.68 00:08:10.767 =================================================================================================================== 00:08:10.767 Total : 28256.55 110.38 0.00 0.00 142476.75 374.49 325557.88 00:08:11.025 00:08:11.025 real 0m6.197s 00:08:11.026 user 0m11.717s 00:08:11.026 sys 0m0.276s 00:08:11.026 18:44:55 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.026 18:44:55 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:08:11.026 ************************************ 00:08:11.026 END TEST bdev_verify 00:08:11.026 ************************************ 00:08:11.026 18:44:56 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:11.026 18:44:56 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:08:11.026 18:44:56 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.026 18:44:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:11.284 ************************************ 00:08:11.284 START TEST bdev_verify_big_io 00:08:11.284 ************************************ 00:08:11.284 18:44:56 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:11.284 [2024-07-24 18:44:56.097160] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
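Both verification stages use the same bdevperf binary and the same bdev.json; only the IO size changes. Read naturally, -q is the per-job queue depth, -o the IO size in bytes, -w the workload, -t the run time in seconds and -m the reactor core mask, so the big-IO variant that starts here is simply the 4 KiB run reissued with 64 KiB IOs:

# the two invocations above, shortened to the parts that differ
bdevperf --json test/bdev/bdev.json -q 128 -o 4096  -w verify -t 5 -C -m 0x3   # bdev_verify, 4 KiB IOs
bdevperf --json test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3   # bdev_verify_big_io, 64 KiB IOs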
00:08:11.284 [2024-07-24 18:44:56.097198] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2035288 ] 00:08:11.284 [2024-07-24 18:44:56.159870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:11.284 [2024-07-24 18:44:56.230742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:11.284 [2024-07-24 18:44:56.230745] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.542 [2024-07-24 18:44:56.365684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:11.542 [2024-07-24 18:44:56.365728] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:11.542 [2024-07-24 18:44:56.365736] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:11.542 [2024-07-24 18:44:56.373693] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:11.542 [2024-07-24 18:44:56.373708] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:11.542 [2024-07-24 18:44:56.381713] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:11.542 [2024-07-24 18:44:56.381725] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:11.542 [2024-07-24 18:44:56.448966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:11.542 [2024-07-24 18:44:56.449006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:11.542 [2024-07-24 18:44:56.449015] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151afd0 00:08:11.542 [2024-07-24 18:44:56.449026] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:11.542 [2024-07-24 18:44:56.450001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:11.542 [2024-07-24 18:44:56.450021] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:11.801 [2024-07-24 18:44:56.599637] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.600426] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.601649] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.602421] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.603674] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.604476] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.605713] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.606973] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.607786] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.609027] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.609825] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.611006] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.611683] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.612759] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.613436] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.614531] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:08:11.801 [2024-07-24 18:44:56.632950] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:11.801 [2024-07-24 18:44:56.634507] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:11.801 Running I/O for 5 seconds... 00:08:18.366 00:08:18.366 Latency(us) 00:08:18.366 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:18.366 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x0 length 0x100 00:08:18.366 Malloc0 : 5.39 308.70 19.29 0.00 0.00 408775.93 577.34 1190383.42 00:08:18.366 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x100 length 0x100 00:08:18.366 Malloc0 : 5.45 282.06 17.63 0.00 0.00 447624.20 585.14 1422068.78 00:08:18.366 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x0 length 0x80 00:08:18.366 Malloc1p0 : 5.84 79.39 4.96 0.00 0.00 1508462.58 1810.04 2460658.35 00:08:18.366 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x80 length 0x80 00:08:18.366 Malloc1p0 : 5.75 122.36 7.65 0.00 0.00 983758.64 2122.12 1669732.45 00:08:18.366 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x0 length 0x80 00:08:18.366 Malloc1p1 : 6.05 55.57 3.47 0.00 0.00 2112707.69 1248.30 3419356.40 00:08:18.366 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.366 Verification LBA range: start 0x80 length 0x80 00:08:18.367 Malloc1p1 : 6.11 55.00 3.44 0.00 0.00 2131145.12 1256.11 3419356.40 00:08:18.367 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p0 : 5.75 41.74 2.61 0.00 0.00 703025.50 464.21 1294242.38 00:08:18.367 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p0 : 5.76 41.70 2.61 0.00 0.00 708179.78 479.82 1206361.72 00:08:18.367 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p1 : 5.75 41.73 2.61 0.00 0.00 699181.18 466.16 1278264.08 00:08:18.367 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p1 : 5.76 41.69 2.61 0.00 0.00 704142.85 481.77 1190383.42 00:08:18.367 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p2 : 5.80 44.12 2.76 0.00 0.00 664730.59 464.21 1254296.62 00:08:18.367 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p2 : 5.76 41.69 2.61 0.00 0.00 699834.39 475.92 1174405.12 00:08:18.367 Job: Malloc2p3 (Core Mask 0x1, workload: verify, 
depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p3 : 5.80 44.11 2.76 0.00 0.00 661181.36 462.26 1246307.47 00:08:18.367 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p3 : 5.76 41.68 2.60 0.00 0.00 696097.27 473.97 1158426.82 00:08:18.367 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p4 : 5.80 44.11 2.76 0.00 0.00 657306.44 448.61 1230329.17 00:08:18.367 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p4 : 5.76 41.67 2.60 0.00 0.00 692164.51 485.67 1142448.52 00:08:18.367 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p5 : 5.81 44.10 2.76 0.00 0.00 653359.36 464.21 1214350.87 00:08:18.367 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p5 : 5.76 41.67 2.60 0.00 0.00 688238.42 475.92 1126470.22 00:08:18.367 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p6 : 5.81 44.09 2.76 0.00 0.00 649871.26 477.87 1198372.57 00:08:18.367 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p6 : 5.82 44.02 2.75 0.00 0.00 651194.94 479.82 1110491.92 00:08:18.367 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x20 00:08:18.367 Malloc2p7 : 5.81 44.09 2.76 0.00 0.00 646604.00 475.92 1182394.27 00:08:18.367 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x20 length 0x20 00:08:18.367 Malloc2p7 : 5.82 44.01 2.75 0.00 0.00 647534.29 499.32 1094513.62 00:08:18.367 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x100 00:08:18.367 TestPT : 6.07 55.66 3.48 0.00 0.00 1985622.99 63913.20 2812180.97 00:08:18.367 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x100 length 0x100 00:08:18.367 TestPT : 6.14 52.09 3.26 0.00 0.00 2114489.39 57422.02 2955985.68 00:08:18.367 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x200 00:08:18.367 raid0 : 6.00 61.29 3.83 0.00 0.00 1772956.97 2168.93 3067833.78 00:08:18.367 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x200 length 0x200 00:08:18.367 raid0 : 6.18 59.57 3.72 0.00 0.00 1814254.91 2184.53 3067833.78 00:08:18.367 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x200 00:08:18.367 concat0 : 6.05 72.91 4.56 0.00 0.00 1473536.58 1747.63 2971963.98 00:08:18.367 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x200 length 0x200 00:08:18.367 concat0 : 6.15 65.09 4.07 0.00 0.00 1641313.67 
1778.83 2971963.98 00:08:18.367 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x100 00:08:18.367 raid1 : 6.07 79.18 4.95 0.00 0.00 1331586.12 2028.50 2860115.87 00:08:18.367 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x100 length 0x100 00:08:18.367 raid1 : 6.11 79.35 4.96 0.00 0.00 1328909.97 2059.70 2860115.87 00:08:18.367 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x0 length 0x4e 00:08:18.367 AIO0 : 6.18 89.71 5.61 0.00 0.00 705550.72 573.44 1717667.35 00:08:18.367 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:18.367 Verification LBA range: start 0x4e length 0x4e 00:08:18.367 AIO0 : 6.18 98.72 6.17 0.00 0.00 641811.29 407.65 1669732.45 00:08:18.367 =================================================================================================================== 00:08:18.367 Total : 2302.85 143.93 0.00 0.00 965299.13 407.65 3419356.40 00:08:18.367 00:08:18.367 real 0m7.192s 00:08:18.367 user 0m13.655s 00:08:18.367 sys 0m0.313s 00:08:18.367 18:45:03 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.367 18:45:03 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:18.367 ************************************ 00:08:18.367 END TEST bdev_verify_big_io 00:08:18.367 ************************************ 00:08:18.367 18:45:03 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:18.367 18:45:03 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:18.367 18:45:03 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.367 18:45:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:18.367 ************************************ 00:08:18.367 START TEST bdev_write_zeroes 00:08:18.367 ************************************ 00:08:18.367 18:45:03 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:18.367 [2024-07-24 18:45:03.338042] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
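The two summary lines are internally consistent: the aggregate MiB/s column is simply aggregate IOPS multiplied by the IO size. A quick sanity check with the numbers copied from the tables above:

# 28256.55 IOPS * 4096 B  = ~110.38 MiB/s  (4 KiB verify run)
#  2302.85 IOPS * 65536 B = ~143.93 MiB/s  (64 KiB big-IO run)
awk 'BEGIN { printf "%.2f %.2f\n", 28256.55*4096/1048576, 2302.85*65536/1048576 }'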
00:08:18.367 [2024-07-24 18:45:03.338081] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2036570 ] 00:08:18.625 [2024-07-24 18:45:03.401219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.625 [2024-07-24 18:45:03.471837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.625 [2024-07-24 18:45:03.609857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:18.625 [2024-07-24 18:45:03.609891] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:18.625 [2024-07-24 18:45:03.609898] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:18.625 [2024-07-24 18:45:03.617867] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:18.625 [2024-07-24 18:45:03.617882] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:18.625 [2024-07-24 18:45:03.625879] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:18.625 [2024-07-24 18:45:03.625892] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:18.884 [2024-07-24 18:45:03.692594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:18.884 [2024-07-24 18:45:03.692633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:18.884 [2024-07-24 18:45:03.692641] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26bed80 00:08:18.884 [2024-07-24 18:45:03.692647] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:18.884 [2024-07-24 18:45:03.693609] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:18.884 [2024-07-24 18:45:03.693634] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:18.884 Running I/O for 1 seconds... 
00:08:20.261 00:08:20.261 Latency(us) 00:08:20.261 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:20.261 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc0 : 1.03 7461.08 29.14 0.00 0.00 17142.91 454.46 27962.03 00:08:20.261 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc1p0 : 1.03 7454.31 29.12 0.00 0.00 17143.49 608.55 27462.70 00:08:20.261 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc1p1 : 1.03 7447.57 29.09 0.00 0.00 17127.86 600.75 26963.38 00:08:20.261 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p0 : 1.03 7440.87 29.07 0.00 0.00 17117.84 604.65 26464.06 00:08:20.261 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p1 : 1.03 7434.02 29.04 0.00 0.00 17106.14 604.65 25964.74 00:08:20.261 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p2 : 1.03 7427.16 29.01 0.00 0.00 17096.67 604.65 25465.42 00:08:20.261 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p3 : 1.04 7420.11 28.98 0.00 0.00 17086.35 604.65 24966.10 00:08:20.261 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p4 : 1.04 7413.52 28.96 0.00 0.00 17080.53 604.65 24466.77 00:08:20.261 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p5 : 1.04 7406.94 28.93 0.00 0.00 17068.01 608.55 23967.45 00:08:20.261 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p6 : 1.04 7400.32 28.91 0.00 0.00 17064.10 600.75 23468.13 00:08:20.261 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 Malloc2p7 : 1.04 7393.77 28.88 0.00 0.00 17053.56 604.65 22968.81 00:08:20.261 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 TestPT : 1.04 7387.17 28.86 0.00 0.00 17044.21 620.25 22344.66 00:08:20.261 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 raid0 : 1.04 7378.82 28.82 0.00 0.00 17030.99 1053.26 21346.01 00:08:20.261 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 concat0 : 1.04 7371.36 28.79 0.00 0.00 17009.31 1061.06 20721.86 00:08:20.261 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 raid1 : 1.04 7362.02 28.76 0.00 0.00 16979.01 1685.21 20721.86 00:08:20.261 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:20.261 AIO0 : 1.04 7356.39 28.74 0.00 0.00 16936.71 635.86 20721.86 00:08:20.261 =================================================================================================================== 00:08:20.261 Total : 118555.43 463.11 0.00 0.00 17067.98 454.46 27962.03 00:08:20.261 00:08:20.261 real 0m1.903s 00:08:20.261 user 0m1.619s 00:08:20.261 sys 0m0.231s 00:08:20.261 18:45:05 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.261 18:45:05 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:08:20.261 ************************************ 00:08:20.261 END TEST bdev_write_zeroes 00:08:20.261 ************************************ 00:08:20.261 18:45:05 blockdev_general 
-- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:20.261 18:45:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:20.261 18:45:05 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.261 18:45:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.261 ************************************ 00:08:20.261 START TEST bdev_json_nonenclosed 00:08:20.261 ************************************ 00:08:20.261 18:45:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:20.521 [2024-07-24 18:45:05.300296] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:08:20.521 [2024-07-24 18:45:05.300330] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2036841 ] 00:08:20.521 [2024-07-24 18:45:05.362183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.521 [2024-07-24 18:45:05.433348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.521 [2024-07-24 18:45:05.433402] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:08:20.521 [2024-07-24 18:45:05.433429] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:20.521 [2024-07-24 18:45:05.433435] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:20.521 00:08:20.521 real 0m0.236s 00:08:20.521 user 0m0.149s 00:08:20.521 sys 0m0.086s 00:08:20.521 18:45:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.521 18:45:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:20.521 ************************************ 00:08:20.521 END TEST bdev_json_nonenclosed 00:08:20.521 ************************************ 00:08:20.780 18:45:05 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:20.780 18:45:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:20.780 18:45:05 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.780 18:45:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.780 ************************************ 00:08:20.780 START TEST bdev_json_nonarray 00:08:20.780 ************************************ 00:08:20.780 18:45:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:20.780 [2024-07-24 18:45:05.595032] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
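The "not enclosed in {}" error above, and the "'subsystems' should be an array" error that follows, are both intentional: these two stages feed bdevperf deliberately malformed config files (nonenclosed.json, then nonarray.json) and expect a clean failure. The error messages imply the shape a valid file must have, a JSON object whose subsystems member is an array; a minimal well-formed counterpart could be written as below (the inner bdev entry is a sketch, only the top-level shape is asserted by the errors in the log):

cat > /tmp/minimal.json <<'EOF'
{
  "subsystems": [
    { "subsystem": "bdev", "config": [] }
  ]
}
EOF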
00:08:20.780 [2024-07-24 18:45:05.595070] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2037041 ] 00:08:20.780 [2024-07-24 18:45:05.650974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.780 [2024-07-24 18:45:05.721694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.780 [2024-07-24 18:45:05.721753] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:08:20.780 [2024-07-24 18:45:05.721762] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:20.780 [2024-07-24 18:45:05.721767] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:21.051 00:08:21.051 real 0m0.236s 00:08:21.051 user 0m0.153s 00:08:21.051 sys 0m0.081s 00:08:21.051 18:45:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:21.051 18:45:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:21.051 ************************************ 00:08:21.051 END TEST bdev_json_nonarray 00:08:21.051 ************************************ 00:08:21.051 18:45:05 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:08:21.051 18:45:05 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:08:21.051 18:45:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:21.051 18:45:05 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.051 18:45:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:21.051 ************************************ 00:08:21.051 START TEST bdev_qos 00:08:21.051 ************************************ 00:08:21.051 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:08:21.051 18:45:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=2037079 00:08:21.051 18:45:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 2037079' 00:08:21.051 Process qos testing pid: 2037079 00:08:21.051 18:45:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 2037079 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2037079 ']' 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:21.052 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
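The QoS stage takes a different path from the earlier ones: bdevperf is started with -z, so it only brings up a reactor on core 1 (mask 0x2) and waits on /var/tmp/spdk.sock; the bdevs are then created over RPC and the queued randread workload is released later with bdevperf.py perform_tests. Condensed, with every command taken from the log below, paths shortened, and rpc_cmd being the harness wrapper around scripts/rpc.py:

bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 &        # start idle, wait for RPC
# (the harness waits for /var/tmp/spdk.sock before issuing RPCs)
rpc_cmd bdev_malloc_create -b Malloc_0 128 512               # 262144 blocks x 512 B = 128 MiB
rpc_cmd bdev_null_create Null_1 128 512                      # null bdev used for the bandwidth test
./examples/bdev/bdevperf/bdevperf.py perform_tests           # release the queued 60 s randread run
rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 23000 Malloc_0   # limits applied once a baseline is measured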
00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.052 18:45:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:21.052 [2024-07-24 18:45:05.900849] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:08:21.052 [2024-07-24 18:45:05.900886] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2037079 ] 00:08:21.052 [2024-07-24 18:45:05.964312] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.052 [2024-07-24 18:45:06.043258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 [ 00:08:21.995 { 00:08:21.995 "name": "Malloc_0", 00:08:21.995 "aliases": [ 00:08:21.995 "49fcf61f-8084-4efe-97ae-fcf7ac731547" 00:08:21.995 ], 00:08:21.995 "product_name": "Malloc disk", 00:08:21.995 "block_size": 512, 00:08:21.995 "num_blocks": 262144, 00:08:21.995 "uuid": "49fcf61f-8084-4efe-97ae-fcf7ac731547", 00:08:21.995 "assigned_rate_limits": { 00:08:21.995 "rw_ios_per_sec": 0, 00:08:21.995 "rw_mbytes_per_sec": 0, 00:08:21.995 "r_mbytes_per_sec": 0, 00:08:21.995 "w_mbytes_per_sec": 0 00:08:21.995 }, 00:08:21.995 "claimed": false, 
00:08:21.995 "zoned": false, 00:08:21.995 "supported_io_types": { 00:08:21.995 "read": true, 00:08:21.995 "write": true, 00:08:21.995 "unmap": true, 00:08:21.995 "flush": true, 00:08:21.995 "reset": true, 00:08:21.995 "nvme_admin": false, 00:08:21.995 "nvme_io": false, 00:08:21.995 "nvme_io_md": false, 00:08:21.995 "write_zeroes": true, 00:08:21.995 "zcopy": true, 00:08:21.995 "get_zone_info": false, 00:08:21.995 "zone_management": false, 00:08:21.995 "zone_append": false, 00:08:21.995 "compare": false, 00:08:21.995 "compare_and_write": false, 00:08:21.995 "abort": true, 00:08:21.995 "seek_hole": false, 00:08:21.995 "seek_data": false, 00:08:21.995 "copy": true, 00:08:21.995 "nvme_iov_md": false 00:08:21.995 }, 00:08:21.995 "memory_domains": [ 00:08:21.995 { 00:08:21.995 "dma_device_id": "system", 00:08:21.995 "dma_device_type": 1 00:08:21.995 }, 00:08:21.995 { 00:08:21.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:21.995 "dma_device_type": 2 00:08:21.995 } 00:08:21.995 ], 00:08:21.995 "driver_specific": {} 00:08:21.995 } 00:08:21.995 ] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 Null_1 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:21.995 [ 00:08:21.995 { 00:08:21.995 "name": "Null_1", 00:08:21.995 "aliases": [ 00:08:21.995 "b5401c5b-9590-48db-928a-502bad52c01b" 00:08:21.995 ], 00:08:21.995 "product_name": "Null disk", 00:08:21.995 "block_size": 512, 00:08:21.995 "num_blocks": 262144, 00:08:21.995 "uuid": "b5401c5b-9590-48db-928a-502bad52c01b", 00:08:21.995 "assigned_rate_limits": { 00:08:21.995 "rw_ios_per_sec": 0, 00:08:21.995 "rw_mbytes_per_sec": 0, 00:08:21.995 "r_mbytes_per_sec": 0, 00:08:21.995 "w_mbytes_per_sec": 0 00:08:21.995 }, 00:08:21.995 
"claimed": false, 00:08:21.995 "zoned": false, 00:08:21.995 "supported_io_types": { 00:08:21.995 "read": true, 00:08:21.995 "write": true, 00:08:21.995 "unmap": false, 00:08:21.995 "flush": false, 00:08:21.995 "reset": true, 00:08:21.995 "nvme_admin": false, 00:08:21.995 "nvme_io": false, 00:08:21.995 "nvme_io_md": false, 00:08:21.995 "write_zeroes": true, 00:08:21.995 "zcopy": false, 00:08:21.995 "get_zone_info": false, 00:08:21.995 "zone_management": false, 00:08:21.995 "zone_append": false, 00:08:21.995 "compare": false, 00:08:21.995 "compare_and_write": false, 00:08:21.995 "abort": true, 00:08:21.995 "seek_hole": false, 00:08:21.995 "seek_data": false, 00:08:21.995 "copy": false, 00:08:21.995 "nvme_iov_md": false 00:08:21.995 }, 00:08:21.995 "driver_specific": {} 00:08:21.995 } 00:08:21.995 ] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:21.995 18:45:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:08:21.995 Running I/O for 60 seconds... 
00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 93401.65 373606.61 0.00 0.00 375808.00 0.00 0.00 ' 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=93401.65 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 93401 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=93401 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=23000 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 23000 -gt 1000 ']' 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 23000 Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 23000 IOPS Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.264 18:45:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:27.264 ************************************ 00:08:27.264 START TEST bdev_qos_iops 00:08:27.264 ************************************ 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 23000 IOPS Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=23000 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:27.264 18:45:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 23005.13 92020.53 0.00 0.00 92828.00 0.00 0.00 ' 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=23005.13 00:08:32.572 18:45:17 
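The limit chosen above is consistent with taking roughly a quarter of the unthrottled result and rounding down to a whole thousand: 93401 IOPS gives 23350, floored to 23000, which is exactly what bdev_set_qos_limit --rw_ios_per_sec receives. The exact expression is an assumption; the numbers are the log's:

io_result=93401
iops_limit=$(( io_result / 4 / 1000 * 1000 ))   # 23350 -> 23000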
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 23005 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=23005 00:08:32.572 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=20700 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=25300 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 23005 -lt 20700 ']' 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 23005 -gt 25300 ']' 00:08:32.573 00:08:32.573 real 0m5.178s 00:08:32.573 user 0m0.083s 00:08:32.573 sys 0m0.038s 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:32.573 18:45:17 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:32.573 ************************************ 00:08:32.573 END TEST bdev_qos_iops 00:08:32.573 ************************************ 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:08:32.573 18:45:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 31184.98 124739.94 0.00 0.00 125952.00 0.00 0.00 ' 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=125952.00 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 125952 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=125952 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=12 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 12 -lt 2 ']' 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 12 Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 12 BANDWIDTH Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:37.840 18:45:22 
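The pass criterion visible above is a plus-or-minus 10% window around the configured limit: with the 23000 IOPS cap the run must land between 20700 and 25300, and the measured 23005 does, so bdev_qos_iops passes. The same 9/10 and 11/10 bounds reappear in the bandwidth checks that follow.

qos_limit=23000
lower=$(( qos_limit * 9 / 10 ))    # 20700
upper=$(( qos_limit * 11 / 10 ))   # 25300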
blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.840 18:45:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:37.840 ************************************ 00:08:37.840 START TEST bdev_qos_bw 00:08:37.840 ************************************ 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 12 BANDWIDTH Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=12 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:08:37.840 18:45:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 3069.29 12277.18 0.00 0.00 12412.00 0.00 0.00 ' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=12412.00 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 12412 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=12412 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=12288 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=11059 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=13516 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 12412 -lt 11059 ']' 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 12412 -gt 13516 ']' 00:08:43.103 00:08:43.103 real 0m5.191s 00:08:43.103 user 0m0.089s 00:08:43.103 sys 0m0.034s 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:43.103 18:45:27 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:08:43.103 ************************************ 00:08:43.103 END TEST bdev_qos_bw 00:08:43.103 ************************************ 00:08:43.103 18:45:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:08:43.104 
18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.104 18:45:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:43.104 ************************************ 00:08:43.104 START TEST bdev_qos_ro_bw 00:08:43.104 ************************************ 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:43.104 18:45:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 512.20 2048.79 0.00 0.00 2060.00 0.00 0.00 ' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2060.00 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2060 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2060 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -lt 1843 ']' 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -gt 
2252 ']' 00:08:48.369 00:08:48.369 real 0m5.151s 00:08:48.369 user 0m0.089s 00:08:48.369 sys 0m0.034s 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.369 18:45:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:08:48.369 ************************************ 00:08:48.369 END TEST bdev_qos_ro_bw 00:08:48.369 ************************************ 00:08:48.369 18:45:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:08:48.369 18:45:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:48.369 18:45:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:48.627 00:08:48.627 Latency(us) 00:08:48.627 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:48.627 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:48.627 Malloc_0 : 26.49 31885.10 124.55 0.00 0.00 7951.81 1388.74 503316.48 00:08:48.627 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:48.627 Null_1 : 26.59 31957.58 124.83 0.00 0.00 7995.30 530.53 101861.67 00:08:48.627 =================================================================================================================== 00:08:48.627 Total : 63842.69 249.39 0.00 0.00 7973.62 530.53 503316.48 00:08:48.627 0 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 2037079 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2037079 ']' 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2037079 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2037079 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2037079' 00:08:48.627 killing process with pid 2037079 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2037079 00:08:48.627 Received shutdown signal, test time was about 26.640732 seconds 00:08:48.627 00:08:48.627 Latency(us) 00:08:48.627 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:48.627 =================================================================================================================== 00:08:48.627 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:48.627 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2037079 
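The three run_qos_test passes above (IOPS limit 23000 with bounds 20700/25300, the 12 MB/s bandwidth limit scaled to 12288 KB/s with bounds 11059/13516, and the 2 MB/s read-only limit scaled to 2048 KB/s with bounds 1843/2252) all accept a measured rate within roughly ±10% of the configured limit. Below is a minimal bash sketch of that acceptance check, reconstructed from the traced values rather than copied from bdev/blockdev.sh; the helper name and variable names are illustrative only.

check_qos_result() {
  # qos_limit: configured limit (IOPS, or KB/s after the MB->KB scaling seen in the trace)
  # qos_result: measured rate parsed from scripts/iostat.py output
  local qos_limit=$1 qos_result=$2
  local lower_limit=$((qos_limit * 9 / 10))    # 90% of the limit (integer math)
  local upper_limit=$((qos_limit * 11 / 10))   # 110% of the limit
  [ "$qos_result" -ge "$lower_limit" ] && [ "$qos_result" -le "$upper_limit" ]
}

# Example matching the IOPS pass above: limit 23000, measured 23005 -> within bounds.
check_qos_result 23000 23005 && echo 'qos ok'

Integer division reproduces the exact bounds printed in the trace, e.g. 12288 * 9 / 10 = 11059 and 12288 * 11 / 10 = 13516.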
00:08:48.886 18:45:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:08:48.886 00:08:48.886 real 0m27.862s 00:08:48.886 user 0m28.420s 00:08:48.886 sys 0m0.617s 00:08:48.886 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.886 18:45:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:48.886 ************************************ 00:08:48.886 END TEST bdev_qos 00:08:48.886 ************************************ 00:08:48.886 18:45:33 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:08:48.886 18:45:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:48.886 18:45:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:48.886 18:45:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:48.886 ************************************ 00:08:48.886 START TEST bdev_qd_sampling 00:08:48.886 ************************************ 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=2042120 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 2042120' 00:08:48.886 Process bdev QD sampling period testing pid: 2042120 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 2042120 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2042120 ']' 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:48.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:48.886 18:45:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:48.886 [2024-07-24 18:45:33.830313] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:08:48.886 [2024-07-24 18:45:33.830349] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042120 ] 00:08:48.886 [2024-07-24 18:45:33.894809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:49.144 [2024-07-24 18:45:33.966744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:49.144 [2024-07-24 18:45:33.966746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:49.709 Malloc_QD 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:49.709 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:49.709 [ 00:08:49.709 { 00:08:49.709 "name": "Malloc_QD", 00:08:49.709 "aliases": [ 00:08:49.709 "370907ee-a99f-424f-b0a0-2e99619c9a78" 00:08:49.709 ], 00:08:49.709 "product_name": "Malloc disk", 00:08:49.709 "block_size": 512, 00:08:49.709 "num_blocks": 262144, 00:08:49.709 "uuid": "370907ee-a99f-424f-b0a0-2e99619c9a78", 00:08:49.709 "assigned_rate_limits": { 00:08:49.709 "rw_ios_per_sec": 0, 00:08:49.709 "rw_mbytes_per_sec": 0, 00:08:49.709 "r_mbytes_per_sec": 0, 00:08:49.709 "w_mbytes_per_sec": 0 00:08:49.709 }, 00:08:49.709 "claimed": false, 00:08:49.709 "zoned": false, 00:08:49.709 "supported_io_types": { 00:08:49.709 "read": true, 00:08:49.709 "write": true, 00:08:49.709 "unmap": true, 00:08:49.709 "flush": true, 00:08:49.709 "reset": true, 00:08:49.709 "nvme_admin": false, 00:08:49.709 
"nvme_io": false, 00:08:49.709 "nvme_io_md": false, 00:08:49.709 "write_zeroes": true, 00:08:49.709 "zcopy": true, 00:08:49.709 "get_zone_info": false, 00:08:49.709 "zone_management": false, 00:08:49.709 "zone_append": false, 00:08:49.709 "compare": false, 00:08:49.709 "compare_and_write": false, 00:08:49.709 "abort": true, 00:08:49.710 "seek_hole": false, 00:08:49.710 "seek_data": false, 00:08:49.710 "copy": true, 00:08:49.710 "nvme_iov_md": false 00:08:49.710 }, 00:08:49.710 "memory_domains": [ 00:08:49.710 { 00:08:49.710 "dma_device_id": "system", 00:08:49.710 "dma_device_type": 1 00:08:49.710 }, 00:08:49.710 { 00:08:49.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:49.710 "dma_device_type": 2 00:08:49.710 } 00:08:49.710 ], 00:08:49.710 "driver_specific": {} 00:08:49.710 } 00:08:49.710 ] 00:08:49.710 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:49.710 18:45:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:08:49.710 18:45:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:08:49.710 18:45:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:49.968 Running I/O for 5 seconds... 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:08:51.866 "tick_rate": 2100000000, 00:08:51.866 "ticks": 11842212936661416, 00:08:51.866 "bdevs": [ 00:08:51.866 { 00:08:51.866 "name": "Malloc_QD", 00:08:51.866 "bytes_read": 987804160, 00:08:51.866 "num_read_ops": 241156, 00:08:51.866 "bytes_written": 0, 00:08:51.866 "num_write_ops": 0, 00:08:51.866 "bytes_unmapped": 0, 00:08:51.866 "num_unmap_ops": 0, 00:08:51.866 "bytes_copied": 0, 00:08:51.866 "num_copy_ops": 0, 00:08:51.866 "read_latency_ticks": 2082683080894, 00:08:51.866 "max_read_latency_ticks": 10678970, 00:08:51.866 "min_read_latency_ticks": 180482, 00:08:51.866 "write_latency_ticks": 0, 00:08:51.866 "max_write_latency_ticks": 0, 00:08:51.866 "min_write_latency_ticks": 0, 00:08:51.866 "unmap_latency_ticks": 0, 00:08:51.866 "max_unmap_latency_ticks": 0, 00:08:51.866 
"min_unmap_latency_ticks": 0, 00:08:51.866 "copy_latency_ticks": 0, 00:08:51.866 "max_copy_latency_ticks": 0, 00:08:51.866 "min_copy_latency_ticks": 0, 00:08:51.866 "io_error": {}, 00:08:51.866 "queue_depth_polling_period": 10, 00:08:51.866 "queue_depth": 512, 00:08:51.866 "io_time": 30, 00:08:51.866 "weighted_io_time": 15360 00:08:51.866 } 00:08:51.866 ] 00:08:51.866 }' 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:08:51.866 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:51.867 00:08:51.867 Latency(us) 00:08:51.867 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:51.867 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:08:51.867 Malloc_QD : 2.01 61895.71 241.78 0.00 0.00 4126.89 1053.26 4462.69 00:08:51.867 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:51.867 Malloc_QD : 2.01 62383.61 243.69 0.00 0.00 4095.04 635.86 5086.84 00:08:51.867 =================================================================================================================== 00:08:51.867 Total : 124279.31 485.47 0.00 0.00 4110.90 635.86 5086.84 00:08:51.867 0 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 2042120 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2042120 ']' 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2042120 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2042120 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2042120' 00:08:51.867 killing process with pid 2042120 00:08:51.867 18:45:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2042120 00:08:51.867 Received shutdown signal, test time was about 2.071997 seconds 00:08:51.867 00:08:51.867 Latency(us) 00:08:51.867 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:51.867 =================================================================================================================== 00:08:51.867 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:51.867 18:45:36 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2042120 00:08:52.125 18:45:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:08:52.125 00:08:52.125 real 0m3.225s 00:08:52.125 user 0m6.382s 00:08:52.125 sys 0m0.304s 00:08:52.125 18:45:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:52.125 18:45:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:52.125 ************************************ 00:08:52.125 END TEST bdev_qd_sampling 00:08:52.125 ************************************ 00:08:52.125 18:45:37 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:08:52.125 18:45:37 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:52.125 18:45:37 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:52.125 18:45:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:52.125 ************************************ 00:08:52.125 START TEST bdev_error 00:08:52.125 ************************************ 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=2042822 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 2042822' 00:08:52.125 Process error testing pid: 2042822 00:08:52.125 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 2042822 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2042822 ']' 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:52.125 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:52.125 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:52.125 [2024-07-24 18:45:37.109724] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:08:52.125 [2024-07-24 18:45:37.109761] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042822 ] 00:08:52.382 [2024-07-24 18:45:37.174323] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.382 [2024-07-24 18:45:37.252560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:08:52.948 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:52.948 Dev_1 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:52.948 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:52.948 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.206 [ 00:08:53.206 { 00:08:53.206 "name": "Dev_1", 00:08:53.206 "aliases": [ 00:08:53.206 "dcd7d96d-f853-4fc8-8b1e-c36778ea903c" 00:08:53.206 ], 00:08:53.206 "product_name": "Malloc disk", 00:08:53.206 "block_size": 512, 00:08:53.206 "num_blocks": 262144, 00:08:53.206 "uuid": "dcd7d96d-f853-4fc8-8b1e-c36778ea903c", 00:08:53.206 "assigned_rate_limits": { 00:08:53.206 "rw_ios_per_sec": 0, 00:08:53.206 "rw_mbytes_per_sec": 0, 00:08:53.206 "r_mbytes_per_sec": 0, 00:08:53.206 "w_mbytes_per_sec": 0 00:08:53.206 }, 00:08:53.206 "claimed": false, 00:08:53.206 "zoned": false, 00:08:53.206 "supported_io_types": { 00:08:53.206 "read": true, 00:08:53.206 "write": true, 00:08:53.206 "unmap": true, 00:08:53.206 "flush": true, 00:08:53.206 "reset": true, 00:08:53.206 "nvme_admin": false, 00:08:53.206 "nvme_io": false, 00:08:53.206 "nvme_io_md": false, 00:08:53.206 "write_zeroes": true, 00:08:53.206 "zcopy": true, 00:08:53.206 "get_zone_info": false, 00:08:53.206 "zone_management": false, 00:08:53.206 "zone_append": false, 00:08:53.206 
"compare": false, 00:08:53.206 "compare_and_write": false, 00:08:53.206 "abort": true, 00:08:53.206 "seek_hole": false, 00:08:53.206 "seek_data": false, 00:08:53.206 "copy": true, 00:08:53.206 "nvme_iov_md": false 00:08:53.206 }, 00:08:53.206 "memory_domains": [ 00:08:53.206 { 00:08:53.206 "dma_device_id": "system", 00:08:53.206 "dma_device_type": 1 00:08:53.206 }, 00:08:53.206 { 00:08:53.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.206 "dma_device_type": 2 00:08:53.206 } 00:08:53.206 ], 00:08:53.206 "driver_specific": {} 00:08:53.206 } 00:08:53.206 ] 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:08:53.206 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.206 true 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.206 18:45:37 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:53.206 18:45:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.206 Dev_2 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.206 18:45:38 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:53.206 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.206 [ 00:08:53.206 { 00:08:53.206 "name": "Dev_2", 00:08:53.206 "aliases": [ 00:08:53.206 "63e33d92-e26b-4c24-ad74-9fcfbeb3d808" 00:08:53.206 ], 00:08:53.206 "product_name": "Malloc disk", 00:08:53.206 "block_size": 512, 00:08:53.206 "num_blocks": 262144, 00:08:53.206 "uuid": "63e33d92-e26b-4c24-ad74-9fcfbeb3d808", 00:08:53.206 "assigned_rate_limits": { 00:08:53.206 "rw_ios_per_sec": 0, 00:08:53.206 "rw_mbytes_per_sec": 0, 00:08:53.206 "r_mbytes_per_sec": 0, 00:08:53.206 "w_mbytes_per_sec": 0 00:08:53.206 }, 00:08:53.206 "claimed": false, 
00:08:53.206 "zoned": false, 00:08:53.206 "supported_io_types": { 00:08:53.206 "read": true, 00:08:53.206 "write": true, 00:08:53.206 "unmap": true, 00:08:53.206 "flush": true, 00:08:53.206 "reset": true, 00:08:53.206 "nvme_admin": false, 00:08:53.206 "nvme_io": false, 00:08:53.206 "nvme_io_md": false, 00:08:53.207 "write_zeroes": true, 00:08:53.207 "zcopy": true, 00:08:53.207 "get_zone_info": false, 00:08:53.207 "zone_management": false, 00:08:53.207 "zone_append": false, 00:08:53.207 "compare": false, 00:08:53.207 "compare_and_write": false, 00:08:53.207 "abort": true, 00:08:53.207 "seek_hole": false, 00:08:53.207 "seek_data": false, 00:08:53.207 "copy": true, 00:08:53.207 "nvme_iov_md": false 00:08:53.207 }, 00:08:53.207 "memory_domains": [ 00:08:53.207 { 00:08:53.207 "dma_device_id": "system", 00:08:53.207 "dma_device_type": 1 00:08:53.207 }, 00:08:53.207 { 00:08:53.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.207 "dma_device_type": 2 00:08:53.207 } 00:08:53.207 ], 00:08:53.207 "driver_specific": {} 00:08:53.207 } 00:08:53.207 ] 00:08:53.207 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.207 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:08:53.207 18:45:38 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:53.207 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:53.207 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:53.207 18:45:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:53.207 18:45:38 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:53.207 18:45:38 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:08:53.207 Running I/O for 5 seconds... 00:08:54.139 18:45:39 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 2042822 00:08:54.139 18:45:39 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 2042822' 00:08:54.139 Process is existed as continue on error is set. 
Pid: 2042822 00:08:54.139 18:45:39 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:54.139 18:45:39 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:54.139 18:45:39 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:54.139 18:45:39 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:08:54.139 Timeout while waiting for response: 00:08:54.139 00:08:54.139 00:08:58.317 00:08:58.317 Latency(us) 00:08:58.317 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:58.317 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:58.317 EE_Dev_1 : 0.93 58034.70 226.70 5.37 0.00 273.43 91.18 464.21 00:08:58.317 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:58.317 Dev_2 : 5.00 125154.09 488.88 0.00 0.00 125.62 42.18 18599.74 00:08:58.317 =================================================================================================================== 00:08:58.317 Total : 183188.79 715.58 5.37 0.00 137.37 42.18 18599.74 00:08:59.262 18:45:44 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 2042822 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2042822 ']' 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2042822 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2042822 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2042822' 00:08:59.262 killing process with pid 2042822 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2042822 00:08:59.262 Received shutdown signal, test time was about 5.000000 seconds 00:08:59.262 00:08:59.262 Latency(us) 00:08:59.262 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:59.262 =================================================================================================================== 00:08:59.262 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:59.262 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2042822 00:08:59.519 18:45:44 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:08:59.519 18:45:44 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=2043967 
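Both error-test passes drive I/O through an error-injection bdev: bdev_error_create wraps the malloc bdev Dev_1 as EE_Dev_1, and bdev_error_inject_error EE_Dev_1 all failure -n 5 makes the next five requests on it fail while Dev_2 stays healthy. Below is a minimal sketch of that setup using scripts/rpc.py against an already-running SPDK application on the default RPC socket; it reproduces only the RPCs traced above, not the bdevperf harness, and the deletion of Dev_2 is assumed rather than shown in this excerpt.

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

$RPC bdev_malloc_create -b Dev_1 128 512                 # 128 MB backing bdev, 512 B blocks (262144 blocks)
$RPC bdev_error_create Dev_1                             # exposes the error-injection bdev EE_Dev_1
$RPC bdev_malloc_create -b Dev_2 128 512                 # second, uninjected bdev
$RPC bdev_error_inject_error EE_Dev_1 all failure -n 5   # next 5 I/Os on EE_Dev_1 complete with failure
# ... run I/O against EE_Dev_1 and Dev_2 ...
$RPC bdev_error_delete EE_Dev_1
$RPC bdev_malloc_delete Dev_1
$RPC bdev_malloc_delete Dev_2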
00:08:59.519 18:45:44 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 2043967' 00:08:59.519 Process error testing pid: 2043967 00:08:59.519 18:45:44 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 2043967 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2043967 ']' 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:59.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:59.519 18:45:44 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:59.520 [2024-07-24 18:45:44.373770] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:08:59.520 [2024-07-24 18:45:44.373813] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043967 ] 00:08:59.520 [2024-07-24 18:45:44.436781] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.520 [2024-07-24 18:45:44.516475] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:00.507 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.507 Dev_1 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.507 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:00.507 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:00.508 18:45:45 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 [ 00:09:00.508 { 00:09:00.508 "name": "Dev_1", 00:09:00.508 "aliases": [ 00:09:00.508 "b635504a-e06b-45f6-9d01-c77a10b3309a" 00:09:00.508 ], 00:09:00.508 "product_name": "Malloc disk", 00:09:00.508 "block_size": 512, 00:09:00.508 "num_blocks": 262144, 00:09:00.508 "uuid": "b635504a-e06b-45f6-9d01-c77a10b3309a", 00:09:00.508 "assigned_rate_limits": { 00:09:00.508 "rw_ios_per_sec": 0, 00:09:00.508 "rw_mbytes_per_sec": 0, 00:09:00.508 "r_mbytes_per_sec": 0, 00:09:00.508 "w_mbytes_per_sec": 0 00:09:00.508 }, 00:09:00.508 "claimed": false, 00:09:00.508 "zoned": false, 00:09:00.508 "supported_io_types": { 00:09:00.508 "read": true, 00:09:00.508 "write": true, 00:09:00.508 "unmap": true, 00:09:00.508 "flush": true, 00:09:00.508 "reset": true, 00:09:00.508 "nvme_admin": false, 00:09:00.508 "nvme_io": false, 00:09:00.508 "nvme_io_md": false, 00:09:00.508 "write_zeroes": true, 00:09:00.508 "zcopy": true, 00:09:00.508 "get_zone_info": false, 00:09:00.508 "zone_management": false, 00:09:00.508 "zone_append": false, 00:09:00.508 "compare": false, 00:09:00.508 "compare_and_write": false, 00:09:00.508 "abort": true, 00:09:00.508 "seek_hole": false, 00:09:00.508 "seek_data": false, 00:09:00.508 "copy": true, 00:09:00.508 "nvme_iov_md": false 00:09:00.508 }, 00:09:00.508 "memory_domains": [ 00:09:00.508 { 00:09:00.508 "dma_device_id": "system", 00:09:00.508 "dma_device_type": 1 00:09:00.508 }, 00:09:00.508 { 00:09:00.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.508 "dma_device_type": 2 00:09:00.508 } 00:09:00.508 ], 00:09:00.508 "driver_specific": {} 00:09:00.508 } 00:09:00.508 ] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 true 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 Dev_2 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 [ 00:09:00.508 { 00:09:00.508 "name": "Dev_2", 00:09:00.508 "aliases": [ 00:09:00.508 "8dbdd17a-8899-4c03-89b3-a453eee23dbf" 00:09:00.508 ], 00:09:00.508 "product_name": "Malloc disk", 00:09:00.508 "block_size": 512, 00:09:00.508 "num_blocks": 262144, 00:09:00.508 "uuid": "8dbdd17a-8899-4c03-89b3-a453eee23dbf", 00:09:00.508 "assigned_rate_limits": { 00:09:00.508 "rw_ios_per_sec": 0, 00:09:00.508 "rw_mbytes_per_sec": 0, 00:09:00.508 "r_mbytes_per_sec": 0, 00:09:00.508 "w_mbytes_per_sec": 0 00:09:00.508 }, 00:09:00.508 "claimed": false, 00:09:00.508 "zoned": false, 00:09:00.508 "supported_io_types": { 00:09:00.508 "read": true, 00:09:00.508 "write": true, 00:09:00.508 "unmap": true, 00:09:00.508 "flush": true, 00:09:00.508 "reset": true, 00:09:00.508 "nvme_admin": false, 00:09:00.508 "nvme_io": false, 00:09:00.508 "nvme_io_md": false, 00:09:00.508 "write_zeroes": true, 00:09:00.508 "zcopy": true, 00:09:00.508 "get_zone_info": false, 00:09:00.508 "zone_management": false, 00:09:00.508 "zone_append": false, 00:09:00.508 "compare": false, 00:09:00.508 "compare_and_write": false, 00:09:00.508 "abort": true, 00:09:00.508 "seek_hole": false, 00:09:00.508 "seek_data": false, 00:09:00.508 "copy": true, 00:09:00.508 "nvme_iov_md": false 00:09:00.508 }, 00:09:00.508 "memory_domains": [ 00:09:00.508 { 00:09:00.508 "dma_device_id": "system", 00:09:00.508 "dma_device_type": 1 00:09:00.508 }, 00:09:00.508 { 00:09:00.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.508 "dma_device_type": 2 00:09:00.508 } 00:09:00.508 ], 00:09:00.508 "driver_specific": {} 00:09:00.508 } 00:09:00.508 ] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:00.508 18:45:45 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 2043967 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2043967 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:09:00.508 18:45:45 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:00.508 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2043967 00:09:00.508 Running I/O for 5 seconds... 00:09:00.508 task offset: 248408 on job bdev=EE_Dev_1 fails 00:09:00.508 00:09:00.508 Latency(us) 00:09:00.508 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:00.508 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:00.508 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:00.508 EE_Dev_1 : 0.00 44088.18 172.22 10020.04 0.00 244.50 90.21 434.96 00:09:00.508 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:00.508 Dev_2 : 0.00 27562.45 107.67 0.00 0.00 429.50 87.77 795.79 00:09:00.508 =================================================================================================================== 00:09:00.508 Total : 71650.62 279.89 10020.04 0.00 344.84 87.77 795.79 00:09:00.508 [2024-07-24 18:45:45.397971] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:00.508 request: 00:09:00.508 { 00:09:00.508 "method": "perform_tests", 00:09:00.508 "req_id": 1 00:09:00.508 } 00:09:00.508 Got JSON-RPC error response 00:09:00.508 response: 00:09:00.508 { 00:09:00.508 "code": -32603, 00:09:00.508 "message": "bdevperf failed with error Operation not permitted" 00:09:00.508 } 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:00.766 00:09:00.766 real 0m8.558s 00:09:00.766 user 0m8.862s 00:09:00.766 sys 0m0.585s 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:00.766 18:45:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:00.766 ************************************ 00:09:00.766 END TEST bdev_error 00:09:00.766 ************************************ 00:09:00.766 18:45:45 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:09:00.766 18:45:45 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:00.766 18:45:45 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:00.766 18:45:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:00.766 ************************************ 00:09:00.766 START TEST bdev_stat 00:09:00.766 ************************************ 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=2044219 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 2044219' 00:09:00.766 Process Bdev IO statistics testing pid: 2044219 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 2044219 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2044219 ']' 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:00.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:00.766 18:45:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:00.766 [2024-07-24 18:45:45.744587] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:09:00.766 [2024-07-24 18:45:45.744625] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044219 ] 00:09:01.024 [2024-07-24 18:45:45.804103] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:01.024 [2024-07-24 18:45:45.882790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:01.024 [2024-07-24 18:45:45.882794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.589 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:01.589 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:01.590 Malloc_STAT 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.590 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:01.590 [ 00:09:01.590 { 00:09:01.590 "name": "Malloc_STAT", 00:09:01.590 "aliases": [ 00:09:01.590 "e708ca92-e930-40ac-bf12-bf7389c05d88" 00:09:01.590 ], 00:09:01.590 "product_name": "Malloc disk", 00:09:01.590 "block_size": 512, 00:09:01.590 "num_blocks": 262144, 00:09:01.590 "uuid": "e708ca92-e930-40ac-bf12-bf7389c05d88", 00:09:01.590 "assigned_rate_limits": { 00:09:01.590 "rw_ios_per_sec": 0, 00:09:01.590 "rw_mbytes_per_sec": 0, 00:09:01.590 "r_mbytes_per_sec": 0, 00:09:01.590 "w_mbytes_per_sec": 0 00:09:01.590 }, 00:09:01.590 "claimed": false, 00:09:01.590 "zoned": false, 00:09:01.590 "supported_io_types": { 00:09:01.590 "read": true, 00:09:01.590 "write": true, 00:09:01.590 "unmap": true, 00:09:01.590 "flush": true, 00:09:01.590 "reset": true, 00:09:01.590 "nvme_admin": false, 00:09:01.590 "nvme_io": false, 00:09:01.590 "nvme_io_md": false, 00:09:01.590 "write_zeroes": true, 00:09:01.590 "zcopy": true, 00:09:01.590 "get_zone_info": false, 00:09:01.590 "zone_management": false, 00:09:01.590 "zone_append": false, 00:09:01.590 "compare": false, 00:09:01.590 "compare_and_write": false, 00:09:01.590 "abort": true, 00:09:01.590 "seek_hole": false, 00:09:01.847 "seek_data": false, 00:09:01.847 "copy": true, 00:09:01.847 "nvme_iov_md": false 00:09:01.847 }, 00:09:01.847 "memory_domains": [ 00:09:01.847 { 00:09:01.847 "dma_device_id": "system", 00:09:01.847 "dma_device_type": 1 00:09:01.847 }, 00:09:01.847 { 00:09:01.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:01.847 "dma_device_type": 2 00:09:01.847 } 00:09:01.847 ], 00:09:01.847 "driver_specific": {} 00:09:01.847 } 00:09:01.847 ] 00:09:01.847 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.847 18:45:46 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:09:01.847 18:45:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:09:01.847 18:45:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:01.847 Running I/O for 10 seconds... 
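For reference, the bdev_stat test set up above is driven entirely over JSON-RPC against a bdevperf app started in wait mode. A minimal hand-driven sketch of the same sequence follows, assuming the SPDK tree as the working directory and the default /var/tmp/spdk.sock socket; every command here also appears in the xtrace, and only the jq aggregation at the end is added to mirror the bracket check the test performs in the output that follows (the per-channel sum must fall between two whole-bdev samples).

  # start bdevperf on cores 0-1 (-m 0x3) in wait mode (-z): 4 KiB random reads, queue depth 256
  ./build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' &
  # create the 128 MiB / 512 B-block malloc bdev the test samples, then kick off the run
  ./scripts/rpc.py bdev_malloc_create -b Malloc_STAT 128 512
  ./scripts/rpc.py bdev_wait_for_examine
  ./examples/bdev/bdevperf/bdevperf.py perform_tests &
  sleep 2
  # sample whole-bdev counters before and after the per-channel sample
  io_count1=$(./scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
  per_chan=$(./scripts/rpc.py bdev_get_iostat -b Malloc_STAT -c | jq -r '[.channels[].num_read_ops] | add')
  io_count2=$(./scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
  # success condition checked by the test: io_count1 <= per-channel sum <= io_count2
  [ "$per_chan" -ge "$io_count1" ] && [ "$per_chan" -le "$io_count2" ]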
00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:09:03.744 "tick_rate": 2100000000, 00:09:03.744 "ticks": 11842237885100118, 00:09:03.744 "bdevs": [ 00:09:03.744 { 00:09:03.744 "name": "Malloc_STAT", 00:09:03.744 "bytes_read": 983609856, 00:09:03.744 "num_read_ops": 240132, 00:09:03.744 "bytes_written": 0, 00:09:03.744 "num_write_ops": 0, 00:09:03.744 "bytes_unmapped": 0, 00:09:03.744 "num_unmap_ops": 0, 00:09:03.744 "bytes_copied": 0, 00:09:03.744 "num_copy_ops": 0, 00:09:03.744 "read_latency_ticks": 2069306133266, 00:09:03.744 "max_read_latency_ticks": 10127540, 00:09:03.744 "min_read_latency_ticks": 178170, 00:09:03.744 "write_latency_ticks": 0, 00:09:03.744 "max_write_latency_ticks": 0, 00:09:03.744 "min_write_latency_ticks": 0, 00:09:03.744 "unmap_latency_ticks": 0, 00:09:03.744 "max_unmap_latency_ticks": 0, 00:09:03.744 "min_unmap_latency_ticks": 0, 00:09:03.744 "copy_latency_ticks": 0, 00:09:03.744 "max_copy_latency_ticks": 0, 00:09:03.744 "min_copy_latency_ticks": 0, 00:09:03.744 "io_error": {} 00:09:03.744 } 00:09:03.744 ] 00:09:03.744 }' 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=240132 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:03.744 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:09:03.744 "tick_rate": 2100000000, 00:09:03.744 "ticks": 11842238015858582, 00:09:03.744 "name": "Malloc_STAT", 00:09:03.744 "channels": [ 00:09:03.744 { 00:09:03.745 "thread_id": 2, 00:09:03.745 "bytes_read": 505413632, 00:09:03.745 "num_read_ops": 123392, 00:09:03.745 "bytes_written": 0, 00:09:03.745 "num_write_ops": 0, 00:09:03.745 "bytes_unmapped": 0, 00:09:03.745 "num_unmap_ops": 0, 
00:09:03.745 "bytes_copied": 0, 00:09:03.745 "num_copy_ops": 0, 00:09:03.745 "read_latency_ticks": 1067608407134, 00:09:03.745 "max_read_latency_ticks": 9233030, 00:09:03.745 "min_read_latency_ticks": 5534334, 00:09:03.745 "write_latency_ticks": 0, 00:09:03.745 "max_write_latency_ticks": 0, 00:09:03.745 "min_write_latency_ticks": 0, 00:09:03.745 "unmap_latency_ticks": 0, 00:09:03.745 "max_unmap_latency_ticks": 0, 00:09:03.745 "min_unmap_latency_ticks": 0, 00:09:03.745 "copy_latency_ticks": 0, 00:09:03.745 "max_copy_latency_ticks": 0, 00:09:03.745 "min_copy_latency_ticks": 0 00:09:03.745 }, 00:09:03.745 { 00:09:03.745 "thread_id": 3, 00:09:03.745 "bytes_read": 509607936, 00:09:03.745 "num_read_ops": 124416, 00:09:03.745 "bytes_written": 0, 00:09:03.745 "num_write_ops": 0, 00:09:03.745 "bytes_unmapped": 0, 00:09:03.745 "num_unmap_ops": 0, 00:09:03.745 "bytes_copied": 0, 00:09:03.745 "num_copy_ops": 0, 00:09:03.745 "read_latency_ticks": 1067986359996, 00:09:03.745 "max_read_latency_ticks": 10127540, 00:09:03.745 "min_read_latency_ticks": 5610462, 00:09:03.745 "write_latency_ticks": 0, 00:09:03.745 "max_write_latency_ticks": 0, 00:09:03.745 "min_write_latency_ticks": 0, 00:09:03.745 "unmap_latency_ticks": 0, 00:09:03.745 "max_unmap_latency_ticks": 0, 00:09:03.745 "min_unmap_latency_ticks": 0, 00:09:03.745 "copy_latency_ticks": 0, 00:09:03.745 "max_copy_latency_ticks": 0, 00:09:03.745 "min_copy_latency_ticks": 0 00:09:03.745 } 00:09:03.745 ] 00:09:03.745 }' 00:09:03.745 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:09:03.745 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=123392 00:09:03.745 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=123392 00:09:03.745 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=124416 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=247808 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:09:04.003 "tick_rate": 2100000000, 00:09:04.003 "ticks": 11842238241247294, 00:09:04.003 "bdevs": [ 00:09:04.003 { 00:09:04.003 "name": "Malloc_STAT", 00:09:04.003 "bytes_read": 1070641664, 00:09:04.003 "num_read_ops": 261380, 00:09:04.003 "bytes_written": 0, 00:09:04.003 "num_write_ops": 0, 00:09:04.003 "bytes_unmapped": 0, 00:09:04.003 "num_unmap_ops": 0, 00:09:04.003 "bytes_copied": 0, 00:09:04.003 "num_copy_ops": 0, 00:09:04.003 "read_latency_ticks": 2252573863878, 00:09:04.003 "max_read_latency_ticks": 10127540, 00:09:04.003 "min_read_latency_ticks": 178170, 00:09:04.003 "write_latency_ticks": 0, 00:09:04.003 "max_write_latency_ticks": 0, 00:09:04.003 "min_write_latency_ticks": 0, 00:09:04.003 "unmap_latency_ticks": 0, 00:09:04.003 "max_unmap_latency_ticks": 0, 00:09:04.003 "min_unmap_latency_ticks": 0, 00:09:04.003 "copy_latency_ticks": 0, 00:09:04.003 "max_copy_latency_ticks": 0, 00:09:04.003 
"min_copy_latency_ticks": 0, 00:09:04.003 "io_error": {} 00:09:04.003 } 00:09:04.003 ] 00:09:04.003 }' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=261380 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 247808 -lt 240132 ']' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 247808 -gt 261380 ']' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:04.003 00:09:04.003 Latency(us) 00:09:04.003 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:04.003 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:04.003 Malloc_STAT : 2.16 62025.08 242.29 0.00 0.00 4118.73 1061.06 4400.27 00:09:04.003 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:04.003 Malloc_STAT : 2.16 62576.73 244.44 0.00 0.00 4082.58 663.16 4837.18 00:09:04.003 =================================================================================================================== 00:09:04.003 Total : 124601.82 486.73 0.00 0.00 4100.57 663.16 4837.18 00:09:04.003 0 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 2044219 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2044219 ']' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2044219 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2044219 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2044219' 00:09:04.003 killing process with pid 2044219 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2044219 00:09:04.003 Received shutdown signal, test time was about 2.232366 seconds 00:09:04.003 00:09:04.003 Latency(us) 00:09:04.003 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:04.003 =================================================================================================================== 00:09:04.003 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:04.003 18:45:48 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2044219 00:09:04.262 18:45:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:09:04.262 00:09:04.262 real 0m3.394s 00:09:04.262 user 0m6.881s 00:09:04.262 sys 0m0.304s 00:09:04.262 18:45:49 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.262 18:45:49 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:04.262 ************************************ 00:09:04.262 END TEST bdev_stat 00:09:04.262 ************************************ 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:04.262 18:45:49 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:04.262 00:09:04.262 real 1m43.756s 00:09:04.262 user 7m3.065s 00:09:04.262 sys 0m14.443s 00:09:04.262 18:45:49 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.262 18:45:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.262 ************************************ 00:09:04.262 END TEST blockdev_general 00:09:04.262 ************************************ 00:09:04.262 18:45:49 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:04.262 18:45:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:04.262 18:45:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.262 18:45:49 -- common/autotest_common.sh@10 -- # set +x 00:09:04.262 ************************************ 00:09:04.262 START TEST bdev_raid 00:09:04.262 ************************************ 00:09:04.262 18:45:49 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:04.520 * Looking for test storage... 
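Every suite in this log, including the bdev_raid.sh run starting here, goes through the same run_test wrapper that prints the START TEST / END TEST banners and the real/user/sys timing seen above. A simplified sketch of what that helper amounts to is below; the real implementation lives in the autotest_common.sh sourced by these scripts and additionally validates the argument count and toggles xtrace, which is omitted here.

  run_test() {
      local name=$1; shift
      echo "************ START TEST $name ************"
      time "$@"        # produces the real/user/sys lines printed after each suite
      echo "************ END TEST $name ************"
  }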
00:09:04.520 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:04.520 18:45:49 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:09:04.520 18:45:49 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:09:04.521 18:45:49 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:04.521 18:45:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:04.521 18:45:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.521 18:45:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:04.521 ************************************ 00:09:04.521 START TEST raid_function_test_raid0 00:09:04.521 ************************************ 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2044987 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2044987' 00:09:04.521 Process raid pid: 2044987 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2044987 /var/tmp/spdk-raid.sock 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2044987 ']' 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:04.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
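The raid0 function test now waiting on /var/tmp/spdk-raid.sock configures its array by piping a generated rpcs.txt into rpc.py. The batch itself is not echoed in this log, so the following is only a plausible equivalent driven one call at a time: the bdev names, the get_bdevs check and the nbd export are taken from the xtrace further down, while the 32 MiB base size and 64 KiB strip size are assumptions inferred from the 131072 x 512 B geometry reported when the array comes up.

  rpc='./scripts/rpc.py -s /var/tmp/spdk-raid.sock'
  # two malloc bdevs become the members Base_1 / Base_2 claimed in the log below
  $rpc bdev_malloc_create -b Base_1 32 512
  $rpc bdev_malloc_create -b Base_2 32 512
  # assemble them into the raid0 bdev named "raid" (strip size in KiB is assumed)
  $rpc bdev_raid_create -n raid -r raid0 -z 64 -b 'Base_1 Base_2'
  # confirm the array is online, then export it through the kernel nbd driver
  $rpc bdev_raid_get_bdevs online
  $rpc nbd_start_disk raid /dev/nbd0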
00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:04.521 18:45:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:04.521 [2024-07-24 18:45:49.393335] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:09:04.521 [2024-07-24 18:45:49.393372] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:04.521 [2024-07-24 18:45:49.456884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.779 [2024-07-24 18:45:49.536638] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.779 [2024-07-24 18:45:49.587271] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:04.779 [2024-07-24 18:45:49.587295] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:09:05.346 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:05.604 [2024-07-24 18:45:50.385934] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:05.604 [2024-07-24 18:45:50.386919] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:05.604 [2024-07-24 18:45:50.386965] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21b8700 00:09:05.604 [2024-07-24 18:45:50.386971] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:05.604 [2024-07-24 18:45:50.387092] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x201b960 00:09:05.604 [2024-07-24 18:45:50.387168] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21b8700 00:09:05.604 [2024-07-24 18:45:50.387173] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x21b8700 00:09:05.604 [2024-07-24 18:45:50.387239] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:05.604 Base_1 00:09:05.604 Base_2 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:05.604 18:45:50 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:05.604 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:05.862 [2024-07-24 18:45:50.722834] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ffc0b0 00:09:05.862 /dev/nbd0 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:05.862 1+0 records in 00:09:05.862 1+0 records out 00:09:05.862 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229151 s, 17.9 MB/s 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:09:05.862 18:45:50 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:05.862 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:05.863 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:05.863 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:06.121 { 00:09:06.121 "nbd_device": "/dev/nbd0", 00:09:06.121 "bdev_name": "raid" 00:09:06.121 } 00:09:06.121 ]' 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:06.121 { 00:09:06.121 "nbd_device": "/dev/nbd0", 00:09:06.121 "bdev_name": "raid" 00:09:06.121 } 00:09:06.121 ]' 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:06.121 18:45:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:06.121 
18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:06.121 4096+0 records in 00:09:06.121 4096+0 records out 00:09:06.121 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0225624 s, 92.9 MB/s 00:09:06.121 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:06.379 4096+0 records in 00:09:06.379 4096+0 records out 00:09:06.379 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.14457 s, 14.5 MB/s 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:06.379 128+0 records in 00:09:06.379 128+0 records out 00:09:06.379 65536 bytes (66 kB, 64 KiB) copied, 0.000373473 s, 175 MB/s 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:06.379 2035+0 records in 00:09:06.379 2035+0 records out 00:09:06.379 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00495814 s, 210 MB/s 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:06.379 456+0 records in 00:09:06.379 456+0 records out 00:09:06.379 233472 bytes (233 kB, 228 KiB) copied, 0.000364503 s, 641 MB/s 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:06.379 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:06.637 [2024-07-24 18:45:51.422099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:06.637 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2044987 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2044987 ']' 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2044987 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2044987 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2044987' 00:09:06.896 killing process with pid 2044987 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2044987 00:09:06.896 [2024-07-24 18:45:51.697069] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:06.896 [2024-07-24 18:45:51.697116] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:06.896 [2024-07-24 18:45:51.697145] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:06.896 [2024-07-24 18:45:51.697151] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b8700 name raid, state offline 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2044987 00:09:06.896 [2024-07-24 18:45:51.712246] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:09:06.896 00:09:06.896 real 0m2.533s 00:09:06.896 user 0m3.450s 00:09:06.896 sys 0m0.728s 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:06.896 18:45:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:06.896 ************************************ 00:09:06.896 END TEST raid_function_test_raid0 00:09:06.896 ************************************ 00:09:07.154 18:45:51 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:09:07.154 18:45:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:07.154 18:45:51 
bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.154 18:45:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:07.154 ************************************ 00:09:07.154 START TEST raid_function_test_concat 00:09:07.154 ************************************ 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2045489 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2045489' 00:09:07.154 Process raid pid: 2045489 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2045489 /var/tmp/spdk-raid.sock 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2045489 ']' 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:07.154 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:07.155 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:07.155 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:07.155 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:07.155 18:45:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:07.155 [2024-07-24 18:45:51.997991] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
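The concat pass starting here repeats the same nbd-backed data and unmap verification the raid0 pass above just completed. Condensed from that xtrace, with the offset/length arrays written out, the whole check reduces to the loop below; the block offsets (0, 1028, 321) and block counts (128, 2035, 456) match the values logged above.

  # seed 2 MiB of random data and copy it onto the exported array with O_DIRECT writes
  dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
  dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
  # discard a few ranges on the device and zero the same ranges in the local copy,
  # so a byte-for-byte cmp still has to succeed after every unmap
  unmap_blk_offs=(0 1028 321); unmap_blk_nums=(128 2035 456)
  for i in 0 1 2; do
      off=${unmap_blk_offs[$i]}; num=${unmap_blk_nums[$i]}
      dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=$off count=$num conv=notrunc
      blkdiscard -o $((off * 512)) -l $((num * 512)) /dev/nbd0
      blockdev --flushbufs /dev/nbd0
      cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
  done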
00:09:07.155 [2024-07-24 18:45:51.998028] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:07.155 [2024-07-24 18:45:52.062481] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.155 [2024-07-24 18:45:52.134389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.413 [2024-07-24 18:45:52.182201] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:07.413 [2024-07-24 18:45:52.182224] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:09:07.979 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:07.979 [2024-07-24 18:45:52.968163] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:07.979 [2024-07-24 18:45:52.969099] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:07.979 [2024-07-24 18:45:52.969137] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x227c700 00:09:07.979 [2024-07-24 18:45:52.969143] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:07.979 [2024-07-24 18:45:52.969261] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20df960 00:09:07.979 [2024-07-24 18:45:52.969338] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x227c700 00:09:07.979 [2024-07-24 18:45:52.969343] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x227c700 00:09:07.979 [2024-07-24 18:45:52.969406] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:07.979 Base_1 00:09:07.979 Base_2 00:09:08.237 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:08.237 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:08.237 18:45:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:08.237 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:08.496 [2024-07-24 18:45:53.321107] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c01b0 00:09:08.496 /dev/nbd0 00:09:08.496 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:08.496 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:08.496 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.497 1+0 records in 00:09:08.497 1+0 records out 00:09:08.497 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231431 s, 17.7 MB/s 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:08.497 
18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:08.497 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:08.755 { 00:09:08.755 "nbd_device": "/dev/nbd0", 00:09:08.755 "bdev_name": "raid" 00:09:08.755 } 00:09:08.755 ]' 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:08.755 { 00:09:08.755 "nbd_device": "/dev/nbd0", 00:09:08.755 "bdev_name": "raid" 00:09:08.755 } 00:09:08.755 ]' 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:08.755 4096+0 records in 00:09:08.755 4096+0 records out 00:09:08.755 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0231098 s, 90.7 MB/s 00:09:08.755 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:09.014 4096+0 records in 00:09:09.014 4096+0 records out 00:09:09.014 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.146308 s, 14.3 MB/s 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:09.014 128+0 records in 00:09:09.014 128+0 records out 00:09:09.014 65536 bytes (66 kB, 64 KiB) copied, 0.000374205 s, 175 MB/s 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:09.014 2035+0 records in 00:09:09.014 2035+0 records out 00:09:09.014 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00497373 s, 209 MB/s 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:09.014 456+0 records in 00:09:09.014 456+0 
records out 00:09:09.014 233472 bytes (233 kB, 228 KiB) copied, 0.0011691 s, 200 MB/s 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.014 18:45:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:09.272 [2024-07-24 18:45:54.036998] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:09.272 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2045489 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2045489 ']' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2045489 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:09.273 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2045489 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2045489' 00:09:09.531 killing process with pid 2045489 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2045489 00:09:09.531 [2024-07-24 18:45:54.301606] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:09.531 [2024-07-24 18:45:54.301655] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:09.531 [2024-07-24 18:45:54.301683] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:09.531 [2024-07-24 18:45:54.301689] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x227c700 name raid, state offline 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2045489 00:09:09.531 [2024-07-24 18:45:54.317082] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:09:09.531 00:09:09.531 real 0m2.539s 00:09:09.531 user 0m3.402s 00:09:09.531 sys 0m0.775s 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.531 18:45:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:09.531 ************************************ 00:09:09.531 END TEST raid_function_test_concat 00:09:09.531 ************************************ 00:09:09.531 18:45:54 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:09:09.531 18:45:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:09.531 18:45:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.531 18:45:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:09.789 ************************************ 00:09:09.789 START TEST 
raid0_resize_test 00:09:09.789 ************************************ 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2045938 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2045938' 00:09:09.789 Process raid pid: 2045938 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2045938 /var/tmp/spdk-raid.sock 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2045938 ']' 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:09.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:09.789 18:45:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:09.789 [2024-07-24 18:45:54.588162] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
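(Reading aid, not part of the log: the raid_function_test_concat run that finished above boils down to the loop sketched here. The offsets and byte counts are the ones that appear in this trace, and the snippet assumes the concat array is already exported as /dev/nbd0.)
# Write 2 MiB of random data through the array via nbd, then for three sub-ranges zero the
# range in the reference file while discarding the same range on the device; after every
# step the device must still compare equal to the reference file (discarded blocks are
# assumed to read back as zeroes on this bdev).
ref=/raidtest/raidrandtest
dev=/dev/nbd0
dd if=/dev/urandom of=$ref bs=512 count=4096
dd if=$ref of=$dev bs=512 count=4096 oflag=direct
blockdev --flushbufs $dev
cmp -b -n 2097152 $ref $dev
for range in "0 65536" "526336 1041920" "164352 233472"; do
  set -- $range                                   # $1 = offset in bytes, $2 = length in bytes
  dd if=/dev/zero of=$ref bs=512 seek=$(( $1 / 512 )) count=$(( $2 / 512 )) conv=notrunc
  blkdiscard -o $1 -l $2 $dev
  blockdev --flushbufs $dev
  cmp -b -n 2097152 $ref $dev
done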
00:09:09.789 [2024-07-24 18:45:54.588200] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:09.789 [2024-07-24 18:45:54.654294] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.789 [2024-07-24 18:45:54.732543] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.789 [2024-07-24 18:45:54.782923] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:09.789 [2024-07-24 18:45:54.782948] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:10.723 18:45:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:10.723 18:45:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:09:10.723 18:45:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:10.723 Base_1 00:09:10.723 18:45:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:10.723 Base_2 00:09:10.723 18:45:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:10.981 [2024-07-24 18:45:55.858518] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:10.981 [2024-07-24 18:45:55.859506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:10.981 [2024-07-24 18:45:55.859540] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1500e80 00:09:10.981 [2024-07-24 18:45:55.859545] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:10.981 [2024-07-24 18:45:55.859685] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16b27b0 00:09:10.981 [2024-07-24 18:45:55.859749] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1500e80 00:09:10.981 [2024-07-24 18:45:55.859753] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1500e80 00:09:10.981 [2024-07-24 18:45:55.859823] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:10.981 18:45:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:11.240 [2024-07-24 18:45:56.018922] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:11.240 [2024-07-24 18:45:56.018934] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:11.240 true 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:09:11.240 [2024-07-24 18:45:56.187456] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:09:11.240 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:11.499 [2024-07-24 18:45:56.351785] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:11.499 [2024-07-24 18:45:56.351797] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:11.499 [2024-07-24 18:45:56.351810] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:11.499 true 00:09:11.499 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:11.499 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:09:11.758 [2024-07-24 18:45:56.520313] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2045938 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2045938 ']' 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2045938 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2045938 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2045938' 00:09:11.758 killing process with pid 2045938 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2045938 00:09:11.758 [2024-07-24 18:45:56.576118] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2045938 00:09:11.758 [2024-07-24 18:45:56.576161] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:11.758 [2024-07-24 18:45:56.576191] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:11.758 [2024-07-24 18:45:56.576196] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1500e80 name Raid, state offline 00:09:11.758 [2024-07-24 18:45:56.577246] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
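(Reading aid, not part of the log: the size checks in the resize test above reduce to the arithmetic below, with all values taken from this trace.)
# Each null base bdev starts at 32 MiB with 512-byte blocks; raid0 over two of them reports
# the sum of the base sizes and only grows once both bases have been resized to 64 MiB.
echo $(( 32 * 1024 * 1024 / 512 ))     # 65536 blocks per 32 MiB base bdev
echo $(( 2 * 65536 ))                  # 131072 blocks (64 MiB) while at most one base is resized
echo $(( 262144 * 512 / 1048576 ))     # 128 MiB once both bases are 64 MiB (262144 blocks)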
00:09:11.758 00:09:11.758 real 0m2.181s 00:09:11.758 user 0m3.313s 00:09:11.758 sys 0m0.393s 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:11.758 18:45:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:11.758 ************************************ 00:09:11.758 END TEST raid0_resize_test 00:09:11.758 ************************************ 00:09:11.758 18:45:56 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:09:11.758 18:45:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:11.758 18:45:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:11.758 18:45:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:11.758 18:45:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:11.758 18:45:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:12.017 ************************************ 00:09:12.017 START TEST raid_state_function_test 00:09:12.017 ************************************ 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 
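(Reading aid, not part of the log: the raid_state_function_test starting here walks the array through the configuring, online and offline states. Condensed to the RPCs it issues, a simplified sketch follows; the actual script also deletes and recreates Existed_Raid between steps and inspects the full JSON each time.)
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid      # base bdevs absent
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'  # configuring
$rpc bdev_malloc_create 32 512 -b BaseBdev1        # one base present: still configuring
$rpc bdev_malloc_create 32 512 -b BaseBdev2        # both present: the array configures and goes online
$rpc bdev_malloc_delete BaseBdev1                  # raid0 cannot lose a member, so the array goes offline
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'  # offline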
00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2046340 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2046340' 00:09:12.017 Process raid pid: 2046340 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2046340 /var/tmp/spdk-raid.sock 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2046340 ']' 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:12.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:12.017 18:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:12.017 [2024-07-24 18:45:56.850059] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
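(Reading aid, not part of the log: each verify_raid_bdev_state call below fetches the array's JSON via bdev_raid_get_bdevs and compares individual fields against the expected values. A hand-rolled equivalent of the first such check, using the field names visible in the JSON that follows, might look like this.)
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')
[ "$(jq -r .state <<<"$info")" = configuring ]
[ "$(jq -r .raid_level <<<"$info")" = raid0 ]
[ "$(jq -r .strip_size_kb <<<"$info")" -eq 64 ]
[ "$(jq -r .num_base_bdevs_operational <<<"$info")" -eq 2 ]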
00:09:12.017 [2024-07-24 18:45:56.850097] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:12.017 [2024-07-24 18:45:56.912380] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.017 [2024-07-24 18:45:56.984321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.276 [2024-07-24 18:45:57.033301] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:12.276 [2024-07-24 18:45:57.033321] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:12.842 18:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:12.842 18:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:12.843 [2024-07-24 18:45:57.779931] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:12.843 [2024-07-24 18:45:57.779960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:12.843 [2024-07-24 18:45:57.779966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:12.843 [2024-07-24 18:45:57.779971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:12.843 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:13.101 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:13.101 "name": "Existed_Raid", 00:09:13.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:13.101 "strip_size_kb": 64, 00:09:13.101 "state": "configuring", 00:09:13.101 "raid_level": "raid0", 00:09:13.101 "superblock": false, 00:09:13.101 
"num_base_bdevs": 2, 00:09:13.101 "num_base_bdevs_discovered": 0, 00:09:13.101 "num_base_bdevs_operational": 2, 00:09:13.101 "base_bdevs_list": [ 00:09:13.101 { 00:09:13.101 "name": "BaseBdev1", 00:09:13.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:13.101 "is_configured": false, 00:09:13.101 "data_offset": 0, 00:09:13.101 "data_size": 0 00:09:13.101 }, 00:09:13.101 { 00:09:13.101 "name": "BaseBdev2", 00:09:13.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:13.101 "is_configured": false, 00:09:13.101 "data_offset": 0, 00:09:13.101 "data_size": 0 00:09:13.101 } 00:09:13.101 ] 00:09:13.101 }' 00:09:13.101 18:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:13.101 18:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:13.668 18:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:13.668 [2024-07-24 18:45:58.577902] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:13.668 [2024-07-24 18:45:58.577922] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1574b80 name Existed_Raid, state configuring 00:09:13.668 18:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:13.926 [2024-07-24 18:45:58.754369] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:13.927 [2024-07-24 18:45:58.754387] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:13.927 [2024-07-24 18:45:58.754391] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:13.927 [2024-07-24 18:45:58.754396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:13.927 18:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:14.185 [2024-07-24 18:45:58.946947] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:14.185 BaseBdev1 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:14.185 18:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:14.185 18:45:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:14.443 [ 
00:09:14.443 { 00:09:14.443 "name": "BaseBdev1", 00:09:14.443 "aliases": [ 00:09:14.443 "d6ab53f7-edde-4251-9411-160ea14fbf5c" 00:09:14.443 ], 00:09:14.443 "product_name": "Malloc disk", 00:09:14.443 "block_size": 512, 00:09:14.443 "num_blocks": 65536, 00:09:14.443 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:14.443 "assigned_rate_limits": { 00:09:14.443 "rw_ios_per_sec": 0, 00:09:14.443 "rw_mbytes_per_sec": 0, 00:09:14.443 "r_mbytes_per_sec": 0, 00:09:14.443 "w_mbytes_per_sec": 0 00:09:14.443 }, 00:09:14.443 "claimed": true, 00:09:14.443 "claim_type": "exclusive_write", 00:09:14.443 "zoned": false, 00:09:14.443 "supported_io_types": { 00:09:14.443 "read": true, 00:09:14.443 "write": true, 00:09:14.443 "unmap": true, 00:09:14.443 "flush": true, 00:09:14.443 "reset": true, 00:09:14.443 "nvme_admin": false, 00:09:14.443 "nvme_io": false, 00:09:14.443 "nvme_io_md": false, 00:09:14.443 "write_zeroes": true, 00:09:14.443 "zcopy": true, 00:09:14.443 "get_zone_info": false, 00:09:14.443 "zone_management": false, 00:09:14.443 "zone_append": false, 00:09:14.443 "compare": false, 00:09:14.443 "compare_and_write": false, 00:09:14.443 "abort": true, 00:09:14.443 "seek_hole": false, 00:09:14.443 "seek_data": false, 00:09:14.443 "copy": true, 00:09:14.443 "nvme_iov_md": false 00:09:14.443 }, 00:09:14.443 "memory_domains": [ 00:09:14.443 { 00:09:14.443 "dma_device_id": "system", 00:09:14.443 "dma_device_type": 1 00:09:14.444 }, 00:09:14.444 { 00:09:14.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:14.444 "dma_device_type": 2 00:09:14.444 } 00:09:14.444 ], 00:09:14.444 "driver_specific": {} 00:09:14.444 } 00:09:14.444 ] 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:14.444 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:14.703 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:14.703 "name": "Existed_Raid", 00:09:14.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:14.703 "strip_size_kb": 64, 00:09:14.703 "state": "configuring", 00:09:14.703 "raid_level": "raid0", 
00:09:14.703 "superblock": false, 00:09:14.703 "num_base_bdevs": 2, 00:09:14.703 "num_base_bdevs_discovered": 1, 00:09:14.703 "num_base_bdevs_operational": 2, 00:09:14.703 "base_bdevs_list": [ 00:09:14.703 { 00:09:14.703 "name": "BaseBdev1", 00:09:14.703 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:14.703 "is_configured": true, 00:09:14.703 "data_offset": 0, 00:09:14.703 "data_size": 65536 00:09:14.703 }, 00:09:14.703 { 00:09:14.703 "name": "BaseBdev2", 00:09:14.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:14.703 "is_configured": false, 00:09:14.703 "data_offset": 0, 00:09:14.703 "data_size": 0 00:09:14.703 } 00:09:14.703 ] 00:09:14.703 }' 00:09:14.703 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:14.703 18:45:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:14.962 18:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:15.220 [2024-07-24 18:46:00.130006] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:15.220 [2024-07-24 18:46:00.130041] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1574470 name Existed_Raid, state configuring 00:09:15.220 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:15.479 [2024-07-24 18:46:00.310490] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:15.479 [2024-07-24 18:46:00.311534] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:15.479 [2024-07-24 18:46:00.311559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:09:15.479 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:15.738 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:15.738 "name": "Existed_Raid", 00:09:15.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:15.738 "strip_size_kb": 64, 00:09:15.738 "state": "configuring", 00:09:15.738 "raid_level": "raid0", 00:09:15.738 "superblock": false, 00:09:15.738 "num_base_bdevs": 2, 00:09:15.738 "num_base_bdevs_discovered": 1, 00:09:15.738 "num_base_bdevs_operational": 2, 00:09:15.738 "base_bdevs_list": [ 00:09:15.738 { 00:09:15.738 "name": "BaseBdev1", 00:09:15.738 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:15.738 "is_configured": true, 00:09:15.738 "data_offset": 0, 00:09:15.738 "data_size": 65536 00:09:15.738 }, 00:09:15.738 { 00:09:15.739 "name": "BaseBdev2", 00:09:15.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:15.739 "is_configured": false, 00:09:15.739 "data_offset": 0, 00:09:15.739 "data_size": 0 00:09:15.739 } 00:09:15.739 ] 00:09:15.739 }' 00:09:15.739 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:15.739 18:46:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:15.997 18:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:16.256 [2024-07-24 18:46:01.119248] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:16.256 [2024-07-24 18:46:01.119273] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1575260 00:09:16.256 [2024-07-24 18:46:01.119277] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:16.256 [2024-07-24 18:46:01.119407] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x171e3d0 00:09:16.256 [2024-07-24 18:46:01.119494] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1575260 00:09:16.256 [2024-07-24 18:46:01.119499] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1575260 00:09:16.256 [2024-07-24 18:46:01.119616] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:16.256 BaseBdev2 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:16.256 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:16.514 [ 
00:09:16.514 { 00:09:16.514 "name": "BaseBdev2", 00:09:16.514 "aliases": [ 00:09:16.514 "3c7e96d2-7360-4197-9de1-ed3e677e51b9" 00:09:16.514 ], 00:09:16.514 "product_name": "Malloc disk", 00:09:16.514 "block_size": 512, 00:09:16.514 "num_blocks": 65536, 00:09:16.514 "uuid": "3c7e96d2-7360-4197-9de1-ed3e677e51b9", 00:09:16.514 "assigned_rate_limits": { 00:09:16.514 "rw_ios_per_sec": 0, 00:09:16.514 "rw_mbytes_per_sec": 0, 00:09:16.514 "r_mbytes_per_sec": 0, 00:09:16.514 "w_mbytes_per_sec": 0 00:09:16.514 }, 00:09:16.514 "claimed": true, 00:09:16.514 "claim_type": "exclusive_write", 00:09:16.514 "zoned": false, 00:09:16.514 "supported_io_types": { 00:09:16.514 "read": true, 00:09:16.514 "write": true, 00:09:16.514 "unmap": true, 00:09:16.514 "flush": true, 00:09:16.514 "reset": true, 00:09:16.514 "nvme_admin": false, 00:09:16.514 "nvme_io": false, 00:09:16.514 "nvme_io_md": false, 00:09:16.514 "write_zeroes": true, 00:09:16.514 "zcopy": true, 00:09:16.514 "get_zone_info": false, 00:09:16.514 "zone_management": false, 00:09:16.514 "zone_append": false, 00:09:16.514 "compare": false, 00:09:16.514 "compare_and_write": false, 00:09:16.514 "abort": true, 00:09:16.514 "seek_hole": false, 00:09:16.514 "seek_data": false, 00:09:16.514 "copy": true, 00:09:16.514 "nvme_iov_md": false 00:09:16.514 }, 00:09:16.514 "memory_domains": [ 00:09:16.514 { 00:09:16.514 "dma_device_id": "system", 00:09:16.514 "dma_device_type": 1 00:09:16.514 }, 00:09:16.514 { 00:09:16.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.514 "dma_device_type": 2 00:09:16.514 } 00:09:16.514 ], 00:09:16.514 "driver_specific": {} 00:09:16.514 } 00:09:16.514 ] 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:16.514 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:16.772 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:09:16.772 "name": "Existed_Raid", 00:09:16.772 "uuid": "a84bc1df-446f-4686-bf9a-d76946e4897f", 00:09:16.772 "strip_size_kb": 64, 00:09:16.772 "state": "online", 00:09:16.772 "raid_level": "raid0", 00:09:16.772 "superblock": false, 00:09:16.772 "num_base_bdevs": 2, 00:09:16.772 "num_base_bdevs_discovered": 2, 00:09:16.772 "num_base_bdevs_operational": 2, 00:09:16.772 "base_bdevs_list": [ 00:09:16.772 { 00:09:16.772 "name": "BaseBdev1", 00:09:16.772 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:16.772 "is_configured": true, 00:09:16.772 "data_offset": 0, 00:09:16.772 "data_size": 65536 00:09:16.772 }, 00:09:16.772 { 00:09:16.772 "name": "BaseBdev2", 00:09:16.772 "uuid": "3c7e96d2-7360-4197-9de1-ed3e677e51b9", 00:09:16.772 "is_configured": true, 00:09:16.772 "data_offset": 0, 00:09:16.772 "data_size": 65536 00:09:16.772 } 00:09:16.772 ] 00:09:16.772 }' 00:09:16.772 18:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:16.772 18:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:17.379 [2024-07-24 18:46:02.314506] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:17.379 "name": "Existed_Raid", 00:09:17.379 "aliases": [ 00:09:17.379 "a84bc1df-446f-4686-bf9a-d76946e4897f" 00:09:17.379 ], 00:09:17.379 "product_name": "Raid Volume", 00:09:17.379 "block_size": 512, 00:09:17.379 "num_blocks": 131072, 00:09:17.379 "uuid": "a84bc1df-446f-4686-bf9a-d76946e4897f", 00:09:17.379 "assigned_rate_limits": { 00:09:17.379 "rw_ios_per_sec": 0, 00:09:17.379 "rw_mbytes_per_sec": 0, 00:09:17.379 "r_mbytes_per_sec": 0, 00:09:17.379 "w_mbytes_per_sec": 0 00:09:17.379 }, 00:09:17.379 "claimed": false, 00:09:17.379 "zoned": false, 00:09:17.379 "supported_io_types": { 00:09:17.379 "read": true, 00:09:17.379 "write": true, 00:09:17.379 "unmap": true, 00:09:17.379 "flush": true, 00:09:17.379 "reset": true, 00:09:17.379 "nvme_admin": false, 00:09:17.379 "nvme_io": false, 00:09:17.379 "nvme_io_md": false, 00:09:17.379 "write_zeroes": true, 00:09:17.379 "zcopy": false, 00:09:17.379 "get_zone_info": false, 00:09:17.379 "zone_management": false, 00:09:17.379 "zone_append": false, 00:09:17.379 "compare": false, 00:09:17.379 "compare_and_write": false, 00:09:17.379 "abort": false, 00:09:17.379 "seek_hole": false, 00:09:17.379 "seek_data": false, 00:09:17.379 "copy": false, 00:09:17.379 "nvme_iov_md": false 00:09:17.379 }, 00:09:17.379 
"memory_domains": [ 00:09:17.379 { 00:09:17.379 "dma_device_id": "system", 00:09:17.379 "dma_device_type": 1 00:09:17.379 }, 00:09:17.379 { 00:09:17.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:17.379 "dma_device_type": 2 00:09:17.379 }, 00:09:17.379 { 00:09:17.379 "dma_device_id": "system", 00:09:17.379 "dma_device_type": 1 00:09:17.379 }, 00:09:17.379 { 00:09:17.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:17.379 "dma_device_type": 2 00:09:17.379 } 00:09:17.379 ], 00:09:17.379 "driver_specific": { 00:09:17.379 "raid": { 00:09:17.379 "uuid": "a84bc1df-446f-4686-bf9a-d76946e4897f", 00:09:17.379 "strip_size_kb": 64, 00:09:17.379 "state": "online", 00:09:17.379 "raid_level": "raid0", 00:09:17.379 "superblock": false, 00:09:17.379 "num_base_bdevs": 2, 00:09:17.379 "num_base_bdevs_discovered": 2, 00:09:17.379 "num_base_bdevs_operational": 2, 00:09:17.379 "base_bdevs_list": [ 00:09:17.379 { 00:09:17.379 "name": "BaseBdev1", 00:09:17.379 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:17.379 "is_configured": true, 00:09:17.379 "data_offset": 0, 00:09:17.379 "data_size": 65536 00:09:17.379 }, 00:09:17.379 { 00:09:17.379 "name": "BaseBdev2", 00:09:17.379 "uuid": "3c7e96d2-7360-4197-9de1-ed3e677e51b9", 00:09:17.379 "is_configured": true, 00:09:17.379 "data_offset": 0, 00:09:17.379 "data_size": 65536 00:09:17.379 } 00:09:17.379 ] 00:09:17.379 } 00:09:17.379 } 00:09:17.379 }' 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:17.379 BaseBdev2' 00:09:17.379 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:17.380 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:17.380 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:17.638 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:17.638 "name": "BaseBdev1", 00:09:17.638 "aliases": [ 00:09:17.638 "d6ab53f7-edde-4251-9411-160ea14fbf5c" 00:09:17.638 ], 00:09:17.638 "product_name": "Malloc disk", 00:09:17.638 "block_size": 512, 00:09:17.638 "num_blocks": 65536, 00:09:17.638 "uuid": "d6ab53f7-edde-4251-9411-160ea14fbf5c", 00:09:17.638 "assigned_rate_limits": { 00:09:17.638 "rw_ios_per_sec": 0, 00:09:17.638 "rw_mbytes_per_sec": 0, 00:09:17.638 "r_mbytes_per_sec": 0, 00:09:17.638 "w_mbytes_per_sec": 0 00:09:17.638 }, 00:09:17.638 "claimed": true, 00:09:17.638 "claim_type": "exclusive_write", 00:09:17.638 "zoned": false, 00:09:17.638 "supported_io_types": { 00:09:17.638 "read": true, 00:09:17.638 "write": true, 00:09:17.638 "unmap": true, 00:09:17.638 "flush": true, 00:09:17.638 "reset": true, 00:09:17.638 "nvme_admin": false, 00:09:17.638 "nvme_io": false, 00:09:17.638 "nvme_io_md": false, 00:09:17.638 "write_zeroes": true, 00:09:17.638 "zcopy": true, 00:09:17.638 "get_zone_info": false, 00:09:17.638 "zone_management": false, 00:09:17.638 "zone_append": false, 00:09:17.638 "compare": false, 00:09:17.638 "compare_and_write": false, 00:09:17.638 "abort": true, 00:09:17.638 "seek_hole": false, 00:09:17.638 "seek_data": false, 00:09:17.638 "copy": true, 00:09:17.638 "nvme_iov_md": false 00:09:17.638 }, 00:09:17.638 
"memory_domains": [ 00:09:17.638 { 00:09:17.638 "dma_device_id": "system", 00:09:17.638 "dma_device_type": 1 00:09:17.638 }, 00:09:17.638 { 00:09:17.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:17.638 "dma_device_type": 2 00:09:17.638 } 00:09:17.638 ], 00:09:17.638 "driver_specific": {} 00:09:17.638 }' 00:09:17.638 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:17.638 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:17.638 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:17.638 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:17.896 18:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:18.154 "name": "BaseBdev2", 00:09:18.154 "aliases": [ 00:09:18.154 "3c7e96d2-7360-4197-9de1-ed3e677e51b9" 00:09:18.154 ], 00:09:18.154 "product_name": "Malloc disk", 00:09:18.154 "block_size": 512, 00:09:18.154 "num_blocks": 65536, 00:09:18.154 "uuid": "3c7e96d2-7360-4197-9de1-ed3e677e51b9", 00:09:18.154 "assigned_rate_limits": { 00:09:18.154 "rw_ios_per_sec": 0, 00:09:18.154 "rw_mbytes_per_sec": 0, 00:09:18.154 "r_mbytes_per_sec": 0, 00:09:18.154 "w_mbytes_per_sec": 0 00:09:18.154 }, 00:09:18.154 "claimed": true, 00:09:18.154 "claim_type": "exclusive_write", 00:09:18.154 "zoned": false, 00:09:18.154 "supported_io_types": { 00:09:18.154 "read": true, 00:09:18.154 "write": true, 00:09:18.154 "unmap": true, 00:09:18.154 "flush": true, 00:09:18.154 "reset": true, 00:09:18.154 "nvme_admin": false, 00:09:18.154 "nvme_io": false, 00:09:18.154 "nvme_io_md": false, 00:09:18.154 "write_zeroes": true, 00:09:18.154 "zcopy": true, 00:09:18.154 "get_zone_info": false, 00:09:18.154 "zone_management": false, 00:09:18.154 "zone_append": false, 00:09:18.154 "compare": false, 00:09:18.154 "compare_and_write": false, 00:09:18.154 "abort": true, 00:09:18.154 "seek_hole": false, 00:09:18.154 "seek_data": false, 00:09:18.154 "copy": true, 00:09:18.154 "nvme_iov_md": false 00:09:18.154 }, 00:09:18.154 "memory_domains": [ 00:09:18.154 { 00:09:18.154 "dma_device_id": "system", 00:09:18.154 "dma_device_type": 1 00:09:18.154 }, 00:09:18.154 { 00:09:18.154 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:09:18.154 "dma_device_type": 2 00:09:18.154 } 00:09:18.154 ], 00:09:18.154 "driver_specific": {} 00:09:18.154 }' 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:18.154 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:18.466 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:18.727 [2024-07-24 18:46:03.513613] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:18.727 [2024-07-24 18:46:03.513630] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:18.727 [2024-07-24 18:46:03.513658] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:18.727 18:46:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:18.727 "name": "Existed_Raid", 00:09:18.727 "uuid": "a84bc1df-446f-4686-bf9a-d76946e4897f", 00:09:18.727 "strip_size_kb": 64, 00:09:18.727 "state": "offline", 00:09:18.727 "raid_level": "raid0", 00:09:18.727 "superblock": false, 00:09:18.727 "num_base_bdevs": 2, 00:09:18.727 "num_base_bdevs_discovered": 1, 00:09:18.727 "num_base_bdevs_operational": 1, 00:09:18.727 "base_bdevs_list": [ 00:09:18.727 { 00:09:18.727 "name": null, 00:09:18.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.727 "is_configured": false, 00:09:18.727 "data_offset": 0, 00:09:18.727 "data_size": 65536 00:09:18.727 }, 00:09:18.727 { 00:09:18.727 "name": "BaseBdev2", 00:09:18.727 "uuid": "3c7e96d2-7360-4197-9de1-ed3e677e51b9", 00:09:18.727 "is_configured": true, 00:09:18.727 "data_offset": 0, 00:09:18.727 "data_size": 65536 00:09:18.727 } 00:09:18.727 ] 00:09:18.727 }' 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:18.727 18:46:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:19.292 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:19.292 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:19.292 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:19.292 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:19.550 [2024-07-24 18:46:04.529122] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:19.550 [2024-07-24 18:46:04.529158] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1575260 name Existed_Raid, state offline 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:19.550 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:19.808 18:46:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2046340 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2046340 ']' 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2046340 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2046340 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2046340' 00:09:19.808 killing process with pid 2046340 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2046340 00:09:19.808 [2024-07-24 18:46:04.752662] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:19.808 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2046340 00:09:19.808 [2024-07-24 18:46:04.753432] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:20.066 18:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:20.066 00:09:20.066 real 0m8.129s 00:09:20.066 user 0m14.642s 00:09:20.066 sys 0m1.274s 00:09:20.066 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.066 18:46:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:20.066 ************************************ 00:09:20.066 END TEST raid_state_function_test 00:09:20.066 ************************************ 00:09:20.066 18:46:04 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:20.066 18:46:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:20.066 18:46:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.067 18:46:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:20.067 ************************************ 00:09:20.067 START TEST raid_state_function_test_sb 00:09:20.067 ************************************ 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:20.067 18:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2047936 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2047936' 00:09:20.067 Process raid pid: 2047936 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2047936 /var/tmp/spdk-raid.sock 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2047936 ']' 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:20.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:20.067 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:20.067 [2024-07-24 18:46:05.048202] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
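[Annotation] The bring-up just logged is the standard harness pattern for these raid tests: bdev_svc is started on a private RPC socket (-r /var/tmp/spdk-raid.sock) with the bdev_raid debug log flag, the harness records its pid (2047936 here) and waits for the socket to answer before issuing RPCs. Reduced to a shell sketch it looks roughly like the lines below; the SPDK_DIR/RPC_SOCK variable names and the rpc_get_methods polling loop are editorial stand-ins (the real script uses waitforlisten), while the bdev_svc path, the rpc.py calls and the BaseBdev/Existed_Raid names are taken from this trace. Note the test itself deliberately issues bdev_raid_create before the base bdevs exist, which is why the log shows "base bdev BaseBdev1 doesn't exist now" and a raid in the "configuring" state; the sketch uses the simpler order.

  # launch a bare bdev_svc target and wait for its RPC socket (sketch, assumptions noted above)
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC_SOCK=/var/tmp/spdk-raid.sock
  "$SPDK_DIR"/test/app/bdev_svc/bdev_svc -r "$RPC_SOCK" -i 0 -L bdev_raid &
  raid_pid=$!
  until "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done

  # build the raid0 volume under test over two malloc base bdevs (with superblock, -s)
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_malloc_create 32 512 -b BaseBdev1
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_malloc_create 32 512 -b BaseBdev2
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # inspect the volume the same way the test does
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

  # tear the target down when finished
  kill -9 "$raid_pid"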
00:09:20.067 [2024-07-24 18:46:05.048242] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:20.325 [2024-07-24 18:46:05.113295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.325 [2024-07-24 18:46:05.191694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.325 [2024-07-24 18:46:05.241340] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:20.325 [2024-07-24 18:46:05.241362] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:20.891 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:20.891 18:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:09:20.891 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:21.152 [2024-07-24 18:46:05.984201] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:21.152 [2024-07-24 18:46:05.984228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:21.152 [2024-07-24 18:46:05.984233] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:21.152 [2024-07-24 18:46:05.984238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:21.152 18:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:21.152 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:21.152 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:21.412 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:21.412 "name": "Existed_Raid", 00:09:21.412 "uuid": "f8b23324-2057-4765-9b7f-65170f2d840d", 00:09:21.412 "strip_size_kb": 64, 00:09:21.412 "state": "configuring", 00:09:21.412 "raid_level": 
"raid0", 00:09:21.412 "superblock": true, 00:09:21.412 "num_base_bdevs": 2, 00:09:21.412 "num_base_bdevs_discovered": 0, 00:09:21.412 "num_base_bdevs_operational": 2, 00:09:21.412 "base_bdevs_list": [ 00:09:21.412 { 00:09:21.412 "name": "BaseBdev1", 00:09:21.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:21.412 "is_configured": false, 00:09:21.412 "data_offset": 0, 00:09:21.412 "data_size": 0 00:09:21.412 }, 00:09:21.412 { 00:09:21.412 "name": "BaseBdev2", 00:09:21.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:21.412 "is_configured": false, 00:09:21.412 "data_offset": 0, 00:09:21.412 "data_size": 0 00:09:21.412 } 00:09:21.412 ] 00:09:21.412 }' 00:09:21.412 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:21.412 18:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:21.670 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:21.927 [2024-07-24 18:46:06.802235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:21.927 [2024-07-24 18:46:06.802258] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14fbb80 name Existed_Raid, state configuring 00:09:21.927 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:22.185 [2024-07-24 18:46:06.978714] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:22.185 [2024-07-24 18:46:06.978736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:22.185 [2024-07-24 18:46:06.978741] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:22.185 [2024-07-24 18:46:06.978746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:22.185 18:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:22.185 [2024-07-24 18:46:07.167299] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:22.185 BaseBdev1 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:22.185 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:22.443 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:22.701 [ 00:09:22.701 { 00:09:22.701 "name": "BaseBdev1", 00:09:22.701 "aliases": [ 00:09:22.701 "0bbe8803-6217-4759-8c99-5355e9bc4faf" 00:09:22.701 ], 00:09:22.701 "product_name": "Malloc disk", 00:09:22.701 "block_size": 512, 00:09:22.701 "num_blocks": 65536, 00:09:22.701 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:22.701 "assigned_rate_limits": { 00:09:22.701 "rw_ios_per_sec": 0, 00:09:22.701 "rw_mbytes_per_sec": 0, 00:09:22.701 "r_mbytes_per_sec": 0, 00:09:22.701 "w_mbytes_per_sec": 0 00:09:22.701 }, 00:09:22.701 "claimed": true, 00:09:22.701 "claim_type": "exclusive_write", 00:09:22.701 "zoned": false, 00:09:22.701 "supported_io_types": { 00:09:22.701 "read": true, 00:09:22.701 "write": true, 00:09:22.701 "unmap": true, 00:09:22.701 "flush": true, 00:09:22.701 "reset": true, 00:09:22.701 "nvme_admin": false, 00:09:22.701 "nvme_io": false, 00:09:22.701 "nvme_io_md": false, 00:09:22.701 "write_zeroes": true, 00:09:22.701 "zcopy": true, 00:09:22.701 "get_zone_info": false, 00:09:22.701 "zone_management": false, 00:09:22.701 "zone_append": false, 00:09:22.701 "compare": false, 00:09:22.701 "compare_and_write": false, 00:09:22.701 "abort": true, 00:09:22.701 "seek_hole": false, 00:09:22.701 "seek_data": false, 00:09:22.701 "copy": true, 00:09:22.701 "nvme_iov_md": false 00:09:22.701 }, 00:09:22.701 "memory_domains": [ 00:09:22.701 { 00:09:22.701 "dma_device_id": "system", 00:09:22.701 "dma_device_type": 1 00:09:22.701 }, 00:09:22.701 { 00:09:22.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:22.701 "dma_device_type": 2 00:09:22.701 } 00:09:22.701 ], 00:09:22.701 "driver_specific": {} 00:09:22.701 } 00:09:22.701 ] 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:22.701 "name": 
"Existed_Raid", 00:09:22.701 "uuid": "d24e56a3-8c9d-4767-aee4-2d6849fb1725", 00:09:22.701 "strip_size_kb": 64, 00:09:22.701 "state": "configuring", 00:09:22.701 "raid_level": "raid0", 00:09:22.701 "superblock": true, 00:09:22.701 "num_base_bdevs": 2, 00:09:22.701 "num_base_bdevs_discovered": 1, 00:09:22.701 "num_base_bdevs_operational": 2, 00:09:22.701 "base_bdevs_list": [ 00:09:22.701 { 00:09:22.701 "name": "BaseBdev1", 00:09:22.701 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:22.701 "is_configured": true, 00:09:22.701 "data_offset": 2048, 00:09:22.701 "data_size": 63488 00:09:22.701 }, 00:09:22.701 { 00:09:22.701 "name": "BaseBdev2", 00:09:22.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:22.701 "is_configured": false, 00:09:22.701 "data_offset": 0, 00:09:22.701 "data_size": 0 00:09:22.701 } 00:09:22.701 ] 00:09:22.701 }' 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:22.701 18:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:23.267 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:23.525 [2024-07-24 18:46:08.306242] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:23.525 [2024-07-24 18:46:08.306269] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14fb470 name Existed_Raid, state configuring 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:23.525 [2024-07-24 18:46:08.486814] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:23.525 [2024-07-24 18:46:08.487853] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:23.525 [2024-07-24 18:46:08.487878] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:23.525 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:23.783 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:23.783 "name": "Existed_Raid", 00:09:23.783 "uuid": "a3f4ca53-6356-4add-bffe-4d47765f81cc", 00:09:23.783 "strip_size_kb": 64, 00:09:23.783 "state": "configuring", 00:09:23.783 "raid_level": "raid0", 00:09:23.783 "superblock": true, 00:09:23.783 "num_base_bdevs": 2, 00:09:23.783 "num_base_bdevs_discovered": 1, 00:09:23.783 "num_base_bdevs_operational": 2, 00:09:23.783 "base_bdevs_list": [ 00:09:23.783 { 00:09:23.783 "name": "BaseBdev1", 00:09:23.783 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:23.783 "is_configured": true, 00:09:23.783 "data_offset": 2048, 00:09:23.783 "data_size": 63488 00:09:23.783 }, 00:09:23.783 { 00:09:23.783 "name": "BaseBdev2", 00:09:23.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:23.783 "is_configured": false, 00:09:23.783 "data_offset": 0, 00:09:23.783 "data_size": 0 00:09:23.783 } 00:09:23.783 ] 00:09:23.783 }' 00:09:23.783 18:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:23.783 18:46:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:24.349 [2024-07-24 18:46:09.323590] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:24.349 [2024-07-24 18:46:09.323692] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14fc260 00:09:24.349 [2024-07-24 18:46:09.323701] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:24.349 [2024-07-24 18:46:09.323817] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14fb3c0 00:09:24.349 [2024-07-24 18:46:09.323895] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14fc260 00:09:24.349 [2024-07-24 18:46:09.323901] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14fc260 00:09:24.349 [2024-07-24 18:46:09.323963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:24.349 BaseBdev2 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:24.349 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:24.607 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:24.865 [ 00:09:24.865 { 00:09:24.865 "name": "BaseBdev2", 00:09:24.865 "aliases": [ 00:09:24.865 "61e50e67-4825-43a2-b005-f313ff691bab" 00:09:24.865 ], 00:09:24.865 "product_name": "Malloc disk", 00:09:24.865 "block_size": 512, 00:09:24.865 "num_blocks": 65536, 00:09:24.865 "uuid": "61e50e67-4825-43a2-b005-f313ff691bab", 00:09:24.865 "assigned_rate_limits": { 00:09:24.865 "rw_ios_per_sec": 0, 00:09:24.865 "rw_mbytes_per_sec": 0, 00:09:24.865 "r_mbytes_per_sec": 0, 00:09:24.865 "w_mbytes_per_sec": 0 00:09:24.865 }, 00:09:24.865 "claimed": true, 00:09:24.865 "claim_type": "exclusive_write", 00:09:24.865 "zoned": false, 00:09:24.865 "supported_io_types": { 00:09:24.865 "read": true, 00:09:24.865 "write": true, 00:09:24.865 "unmap": true, 00:09:24.865 "flush": true, 00:09:24.865 "reset": true, 00:09:24.865 "nvme_admin": false, 00:09:24.865 "nvme_io": false, 00:09:24.865 "nvme_io_md": false, 00:09:24.865 "write_zeroes": true, 00:09:24.865 "zcopy": true, 00:09:24.865 "get_zone_info": false, 00:09:24.865 "zone_management": false, 00:09:24.865 "zone_append": false, 00:09:24.865 "compare": false, 00:09:24.865 "compare_and_write": false, 00:09:24.865 "abort": true, 00:09:24.865 "seek_hole": false, 00:09:24.865 "seek_data": false, 00:09:24.865 "copy": true, 00:09:24.865 "nvme_iov_md": false 00:09:24.865 }, 00:09:24.865 "memory_domains": [ 00:09:24.865 { 00:09:24.865 "dma_device_id": "system", 00:09:24.865 "dma_device_type": 1 00:09:24.865 }, 00:09:24.865 { 00:09:24.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:24.865 "dma_device_type": 2 00:09:24.865 } 00:09:24.865 ], 00:09:24.865 "driver_specific": {} 00:09:24.865 } 00:09:24.865 ] 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:24.865 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:25.123 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:25.123 "name": "Existed_Raid", 00:09:25.123 "uuid": "a3f4ca53-6356-4add-bffe-4d47765f81cc", 00:09:25.123 "strip_size_kb": 64, 00:09:25.123 "state": "online", 00:09:25.123 "raid_level": "raid0", 00:09:25.123 "superblock": true, 00:09:25.123 "num_base_bdevs": 2, 00:09:25.123 "num_base_bdevs_discovered": 2, 00:09:25.123 "num_base_bdevs_operational": 2, 00:09:25.123 "base_bdevs_list": [ 00:09:25.123 { 00:09:25.123 "name": "BaseBdev1", 00:09:25.123 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:25.123 "is_configured": true, 00:09:25.123 "data_offset": 2048, 00:09:25.123 "data_size": 63488 00:09:25.123 }, 00:09:25.123 { 00:09:25.123 "name": "BaseBdev2", 00:09:25.123 "uuid": "61e50e67-4825-43a2-b005-f313ff691bab", 00:09:25.123 "is_configured": true, 00:09:25.123 "data_offset": 2048, 00:09:25.123 "data_size": 63488 00:09:25.123 } 00:09:25.123 ] 00:09:25.123 }' 00:09:25.123 18:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:25.123 18:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:25.381 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:25.639 [2024-07-24 18:46:10.526872] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:25.639 "name": "Existed_Raid", 00:09:25.639 "aliases": [ 00:09:25.639 "a3f4ca53-6356-4add-bffe-4d47765f81cc" 00:09:25.639 ], 00:09:25.639 "product_name": "Raid Volume", 00:09:25.639 "block_size": 512, 00:09:25.639 "num_blocks": 126976, 00:09:25.639 "uuid": "a3f4ca53-6356-4add-bffe-4d47765f81cc", 00:09:25.639 "assigned_rate_limits": { 00:09:25.639 "rw_ios_per_sec": 0, 00:09:25.639 "rw_mbytes_per_sec": 0, 00:09:25.639 "r_mbytes_per_sec": 0, 00:09:25.639 "w_mbytes_per_sec": 0 00:09:25.639 }, 00:09:25.639 "claimed": false, 00:09:25.639 "zoned": false, 00:09:25.639 "supported_io_types": { 00:09:25.639 "read": true, 00:09:25.639 "write": true, 00:09:25.639 "unmap": true, 00:09:25.639 "flush": true, 00:09:25.639 "reset": true, 00:09:25.639 "nvme_admin": false, 00:09:25.639 "nvme_io": false, 00:09:25.639 "nvme_io_md": false, 00:09:25.639 "write_zeroes": true, 
00:09:25.639 "zcopy": false, 00:09:25.639 "get_zone_info": false, 00:09:25.639 "zone_management": false, 00:09:25.639 "zone_append": false, 00:09:25.639 "compare": false, 00:09:25.639 "compare_and_write": false, 00:09:25.639 "abort": false, 00:09:25.639 "seek_hole": false, 00:09:25.639 "seek_data": false, 00:09:25.639 "copy": false, 00:09:25.639 "nvme_iov_md": false 00:09:25.639 }, 00:09:25.639 "memory_domains": [ 00:09:25.639 { 00:09:25.639 "dma_device_id": "system", 00:09:25.639 "dma_device_type": 1 00:09:25.639 }, 00:09:25.639 { 00:09:25.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:25.639 "dma_device_type": 2 00:09:25.639 }, 00:09:25.639 { 00:09:25.639 "dma_device_id": "system", 00:09:25.639 "dma_device_type": 1 00:09:25.639 }, 00:09:25.639 { 00:09:25.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:25.639 "dma_device_type": 2 00:09:25.639 } 00:09:25.639 ], 00:09:25.639 "driver_specific": { 00:09:25.639 "raid": { 00:09:25.639 "uuid": "a3f4ca53-6356-4add-bffe-4d47765f81cc", 00:09:25.639 "strip_size_kb": 64, 00:09:25.639 "state": "online", 00:09:25.639 "raid_level": "raid0", 00:09:25.639 "superblock": true, 00:09:25.639 "num_base_bdevs": 2, 00:09:25.639 "num_base_bdevs_discovered": 2, 00:09:25.639 "num_base_bdevs_operational": 2, 00:09:25.639 "base_bdevs_list": [ 00:09:25.639 { 00:09:25.639 "name": "BaseBdev1", 00:09:25.639 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:25.639 "is_configured": true, 00:09:25.639 "data_offset": 2048, 00:09:25.639 "data_size": 63488 00:09:25.639 }, 00:09:25.639 { 00:09:25.639 "name": "BaseBdev2", 00:09:25.639 "uuid": "61e50e67-4825-43a2-b005-f313ff691bab", 00:09:25.639 "is_configured": true, 00:09:25.639 "data_offset": 2048, 00:09:25.639 "data_size": 63488 00:09:25.639 } 00:09:25.639 ] 00:09:25.639 } 00:09:25.639 } 00:09:25.639 }' 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:25.639 BaseBdev2' 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:25.639 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:25.896 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:25.896 "name": "BaseBdev1", 00:09:25.896 "aliases": [ 00:09:25.896 "0bbe8803-6217-4759-8c99-5355e9bc4faf" 00:09:25.896 ], 00:09:25.896 "product_name": "Malloc disk", 00:09:25.896 "block_size": 512, 00:09:25.896 "num_blocks": 65536, 00:09:25.896 "uuid": "0bbe8803-6217-4759-8c99-5355e9bc4faf", 00:09:25.896 "assigned_rate_limits": { 00:09:25.896 "rw_ios_per_sec": 0, 00:09:25.896 "rw_mbytes_per_sec": 0, 00:09:25.896 "r_mbytes_per_sec": 0, 00:09:25.896 "w_mbytes_per_sec": 0 00:09:25.896 }, 00:09:25.896 "claimed": true, 00:09:25.896 "claim_type": "exclusive_write", 00:09:25.896 "zoned": false, 00:09:25.896 "supported_io_types": { 00:09:25.896 "read": true, 00:09:25.896 "write": true, 00:09:25.896 "unmap": true, 00:09:25.897 "flush": true, 00:09:25.897 "reset": true, 00:09:25.897 "nvme_admin": false, 00:09:25.897 "nvme_io": false, 00:09:25.897 "nvme_io_md": false, 00:09:25.897 
"write_zeroes": true, 00:09:25.897 "zcopy": true, 00:09:25.897 "get_zone_info": false, 00:09:25.897 "zone_management": false, 00:09:25.897 "zone_append": false, 00:09:25.897 "compare": false, 00:09:25.897 "compare_and_write": false, 00:09:25.897 "abort": true, 00:09:25.897 "seek_hole": false, 00:09:25.897 "seek_data": false, 00:09:25.897 "copy": true, 00:09:25.897 "nvme_iov_md": false 00:09:25.897 }, 00:09:25.897 "memory_domains": [ 00:09:25.897 { 00:09:25.897 "dma_device_id": "system", 00:09:25.897 "dma_device_type": 1 00:09:25.897 }, 00:09:25.897 { 00:09:25.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:25.897 "dma_device_type": 2 00:09:25.897 } 00:09:25.897 ], 00:09:25.897 "driver_specific": {} 00:09:25.897 }' 00:09:25.897 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:25.897 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:25.897 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:25.897 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:25.897 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:26.154 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:26.154 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:26.154 18:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:26.154 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:26.412 "name": "BaseBdev2", 00:09:26.412 "aliases": [ 00:09:26.412 "61e50e67-4825-43a2-b005-f313ff691bab" 00:09:26.412 ], 00:09:26.412 "product_name": "Malloc disk", 00:09:26.412 "block_size": 512, 00:09:26.412 "num_blocks": 65536, 00:09:26.412 "uuid": "61e50e67-4825-43a2-b005-f313ff691bab", 00:09:26.412 "assigned_rate_limits": { 00:09:26.412 "rw_ios_per_sec": 0, 00:09:26.412 "rw_mbytes_per_sec": 0, 00:09:26.412 "r_mbytes_per_sec": 0, 00:09:26.412 "w_mbytes_per_sec": 0 00:09:26.412 }, 00:09:26.412 "claimed": true, 00:09:26.412 "claim_type": "exclusive_write", 00:09:26.412 "zoned": false, 00:09:26.412 "supported_io_types": { 00:09:26.412 "read": true, 00:09:26.412 "write": true, 00:09:26.412 "unmap": true, 00:09:26.412 "flush": true, 00:09:26.412 "reset": true, 00:09:26.412 "nvme_admin": false, 00:09:26.412 "nvme_io": false, 00:09:26.412 "nvme_io_md": false, 00:09:26.412 "write_zeroes": true, 00:09:26.412 "zcopy": true, 00:09:26.412 "get_zone_info": false, 00:09:26.412 "zone_management": false, 
00:09:26.412 "zone_append": false, 00:09:26.412 "compare": false, 00:09:26.412 "compare_and_write": false, 00:09:26.412 "abort": true, 00:09:26.412 "seek_hole": false, 00:09:26.412 "seek_data": false, 00:09:26.412 "copy": true, 00:09:26.412 "nvme_iov_md": false 00:09:26.412 }, 00:09:26.412 "memory_domains": [ 00:09:26.412 { 00:09:26.412 "dma_device_id": "system", 00:09:26.412 "dma_device_type": 1 00:09:26.412 }, 00:09:26.412 { 00:09:26.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:26.412 "dma_device_type": 2 00:09:26.412 } 00:09:26.412 ], 00:09:26.412 "driver_specific": {} 00:09:26.412 }' 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:26.412 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:26.670 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:26.928 [2024-07-24 18:46:11.693743] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:26.928 [2024-07-24 18:46:11.693761] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:26.928 [2024-07-24 18:46:11.693791] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:26.928 "name": "Existed_Raid", 00:09:26.928 "uuid": "a3f4ca53-6356-4add-bffe-4d47765f81cc", 00:09:26.928 "strip_size_kb": 64, 00:09:26.928 "state": "offline", 00:09:26.928 "raid_level": "raid0", 00:09:26.928 "superblock": true, 00:09:26.928 "num_base_bdevs": 2, 00:09:26.928 "num_base_bdevs_discovered": 1, 00:09:26.928 "num_base_bdevs_operational": 1, 00:09:26.928 "base_bdevs_list": [ 00:09:26.928 { 00:09:26.928 "name": null, 00:09:26.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:26.928 "is_configured": false, 00:09:26.928 "data_offset": 2048, 00:09:26.928 "data_size": 63488 00:09:26.928 }, 00:09:26.928 { 00:09:26.928 "name": "BaseBdev2", 00:09:26.928 "uuid": "61e50e67-4825-43a2-b005-f313ff691bab", 00:09:26.928 "is_configured": true, 00:09:26.928 "data_offset": 2048, 00:09:26.928 "data_size": 63488 00:09:26.928 } 00:09:26.928 ] 00:09:26.928 }' 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:26.928 18:46:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:27.493 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:27.493 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:27.493 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:27.493 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:27.751 [2024-07-24 18:46:12.693243] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:27.751 [2024-07-24 18:46:12.693284] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14fc260 name Existed_Raid, state offline 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:27.751 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2047936 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2047936 ']' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2047936 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2047936 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2047936' 00:09:28.009 killing process with pid 2047936 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2047936 00:09:28.009 [2024-07-24 18:46:12.931379] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:28.009 18:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2047936 00:09:28.009 [2024-07-24 18:46:12.932141] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:28.267 18:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:28.267 00:09:28.267 real 0m8.110s 00:09:28.267 user 0m14.529s 00:09:28.267 sys 0m1.328s 00:09:28.267 18:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:28.267 18:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:28.267 ************************************ 00:09:28.267 END TEST raid_state_function_test_sb 00:09:28.267 ************************************ 00:09:28.267 18:46:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:28.267 18:46:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:28.267 18:46:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.267 18:46:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:28.267 ************************************ 00:09:28.267 START TEST raid_superblock_test 00:09:28.268 ************************************ 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 
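[Annotation] Both state-function tests above repeat the same verification idiom: dump all raid bdevs over RPC, filter the one under test with jq, and compare the fields that define its state ("state", "num_base_bdevs_discovered", "num_base_bdevs_operational", "base_bdevs_list"). A compact sketch of that check follows; the check_state helper name and shell variables are illustrative, while the RPC call, the jq filter and the field names are the ones visible in the JSON dumps in this trace.

  # sketch of the verify_raid_bdev_state idiom used throughout this log
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC_SOCK=/var/tmp/spdk-raid.sock

  check_state() {
      # $1 = raid bdev name, $2 = expected state, $3 = expected num_base_bdevs_discovered
      local info state discovered
      info=$("$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_raid_get_bdevs all |
             jq -r ".[] | select(.name == \"$1\")")
      state=$(jq -r '.state' <<< "$info")
      discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")
      [ "$state" = "$2" ] && [ "$discovered" = "$3" ]
  }

  # e.g. with only BaseBdev1 configured the volume should still be "configuring";
  # after deleting a base bdev of an online raid0 it drops to "offline"
  check_state Existed_Raid configuring 1 || echo "unexpected Existed_Raid state"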
00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2049533 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2049533 /var/tmp/spdk-raid.sock 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2049533 ']' 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:28.268 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:28.268 18:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:28.268 [2024-07-24 18:46:13.212954] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
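[Annotation] The raid_superblock_test that starts here layers its base devices differently from the state-function tests: each malloc bdev is wrapped in a passthru bdev (pt1/pt2) created with a fixed UUID before the raid0 volume raid_bdev1 is assembled on top with -s, so the base bdev UUIDs referenced by the test are known in advance (they appear as 00000000-0000-0000-0000-000000000001/...0002 in the JSON dumps that follow). The RPCs involved, as they appear further down in this trace, reduce to the sketch below; SPDK_DIR and RPC_SOCK are the same illustrative variables as in the earlier sketches, and the sizes, names and UUIDs are taken from the log.

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC_SOCK=/var/tmp/spdk-raid.sock

  # malloc -> passthru (fixed UUID) -> raid0 with superblock
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_malloc_create 32 512 -b malloc1
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_malloc_create 32 512 -b malloc2
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  "$SPDK_DIR"/scripts/rpc.py -s "$RPC_SOCK" bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s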
00:09:28.268 [2024-07-24 18:46:13.212993] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2049533 ] 00:09:28.268 [2024-07-24 18:46:13.270547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.526 [2024-07-24 18:46:13.349775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:28.526 [2024-07-24 18:46:13.400390] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:28.526 [2024-07-24 18:46:13.400418] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:29.090 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:29.347 malloc1 00:09:29.347 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:29.347 [2024-07-24 18:46:14.304705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:29.347 [2024-07-24 18:46:14.304739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:29.347 [2024-07-24 18:46:14.304751] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x121be20 00:09:29.347 [2024-07-24 18:46:14.304773] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:29.348 [2024-07-24 18:46:14.305916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:29.348 [2024-07-24 18:46:14.305937] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:29.348 pt1 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:29.348 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:29.605 malloc2 00:09:29.605 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:29.863 [2024-07-24 18:46:14.641324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:29.863 [2024-07-24 18:46:14.641355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:29.863 [2024-07-24 18:46:14.641365] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c5ed0 00:09:29.863 [2024-07-24 18:46:14.641371] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:29.863 [2024-07-24 18:46:14.642429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:29.863 [2024-07-24 18:46:14.642449] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:29.863 pt2 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:29.863 [2024-07-24 18:46:14.809788] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:29.863 [2024-07-24 18:46:14.810660] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:29.863 [2024-07-24 18:46:14.810758] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c5170 00:09:29.863 [2024-07-24 18:46:14.810766] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:29.863 [2024-07-24 18:46:14.810894] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c75d0 00:09:29.863 [2024-07-24 18:46:14.810984] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c5170 00:09:29.863 [2024-07-24 18:46:14.810989] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13c5170 00:09:29.863 [2024-07-24 18:46:14.811054] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:29.863 18:46:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:29.863 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:30.121 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:30.121 "name": "raid_bdev1", 00:09:30.121 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:30.121 "strip_size_kb": 64, 00:09:30.121 "state": "online", 00:09:30.121 "raid_level": "raid0", 00:09:30.121 "superblock": true, 00:09:30.121 "num_base_bdevs": 2, 00:09:30.121 "num_base_bdevs_discovered": 2, 00:09:30.121 "num_base_bdevs_operational": 2, 00:09:30.121 "base_bdevs_list": [ 00:09:30.121 { 00:09:30.121 "name": "pt1", 00:09:30.121 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:30.121 "is_configured": true, 00:09:30.121 "data_offset": 2048, 00:09:30.121 "data_size": 63488 00:09:30.121 }, 00:09:30.121 { 00:09:30.121 "name": "pt2", 00:09:30.121 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:30.121 "is_configured": true, 00:09:30.121 "data_offset": 2048, 00:09:30.121 "data_size": 63488 00:09:30.121 } 00:09:30.121 ] 00:09:30.121 }' 00:09:30.121 18:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:30.121 18:46:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:30.687 [2024-07-24 18:46:15.636066] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:30.687 "name": "raid_bdev1", 00:09:30.687 "aliases": [ 00:09:30.687 "dba11512-a124-4497-8a7d-b26bf2c0b280" 00:09:30.687 ], 00:09:30.687 "product_name": "Raid Volume", 00:09:30.687 "block_size": 512, 00:09:30.687 "num_blocks": 126976, 00:09:30.687 "uuid": 
"dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:30.687 "assigned_rate_limits": { 00:09:30.687 "rw_ios_per_sec": 0, 00:09:30.687 "rw_mbytes_per_sec": 0, 00:09:30.687 "r_mbytes_per_sec": 0, 00:09:30.687 "w_mbytes_per_sec": 0 00:09:30.687 }, 00:09:30.687 "claimed": false, 00:09:30.687 "zoned": false, 00:09:30.687 "supported_io_types": { 00:09:30.687 "read": true, 00:09:30.687 "write": true, 00:09:30.687 "unmap": true, 00:09:30.687 "flush": true, 00:09:30.687 "reset": true, 00:09:30.687 "nvme_admin": false, 00:09:30.687 "nvme_io": false, 00:09:30.687 "nvme_io_md": false, 00:09:30.687 "write_zeroes": true, 00:09:30.687 "zcopy": false, 00:09:30.687 "get_zone_info": false, 00:09:30.687 "zone_management": false, 00:09:30.687 "zone_append": false, 00:09:30.687 "compare": false, 00:09:30.687 "compare_and_write": false, 00:09:30.687 "abort": false, 00:09:30.687 "seek_hole": false, 00:09:30.687 "seek_data": false, 00:09:30.687 "copy": false, 00:09:30.687 "nvme_iov_md": false 00:09:30.687 }, 00:09:30.687 "memory_domains": [ 00:09:30.687 { 00:09:30.687 "dma_device_id": "system", 00:09:30.687 "dma_device_type": 1 00:09:30.687 }, 00:09:30.687 { 00:09:30.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.687 "dma_device_type": 2 00:09:30.687 }, 00:09:30.687 { 00:09:30.687 "dma_device_id": "system", 00:09:30.687 "dma_device_type": 1 00:09:30.687 }, 00:09:30.687 { 00:09:30.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.687 "dma_device_type": 2 00:09:30.687 } 00:09:30.687 ], 00:09:30.687 "driver_specific": { 00:09:30.687 "raid": { 00:09:30.687 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:30.687 "strip_size_kb": 64, 00:09:30.687 "state": "online", 00:09:30.687 "raid_level": "raid0", 00:09:30.687 "superblock": true, 00:09:30.687 "num_base_bdevs": 2, 00:09:30.687 "num_base_bdevs_discovered": 2, 00:09:30.687 "num_base_bdevs_operational": 2, 00:09:30.687 "base_bdevs_list": [ 00:09:30.687 { 00:09:30.687 "name": "pt1", 00:09:30.687 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:30.687 "is_configured": true, 00:09:30.687 "data_offset": 2048, 00:09:30.687 "data_size": 63488 00:09:30.687 }, 00:09:30.687 { 00:09:30.687 "name": "pt2", 00:09:30.687 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:30.687 "is_configured": true, 00:09:30.687 "data_offset": 2048, 00:09:30.687 "data_size": 63488 00:09:30.687 } 00:09:30.687 ] 00:09:30.687 } 00:09:30.687 } 00:09:30.687 }' 00:09:30.687 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:30.945 pt2' 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:30.945 "name": "pt1", 00:09:30.945 "aliases": [ 00:09:30.945 "00000000-0000-0000-0000-000000000001" 00:09:30.945 ], 00:09:30.945 "product_name": "passthru", 00:09:30.945 "block_size": 512, 00:09:30.945 "num_blocks": 65536, 00:09:30.945 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:30.945 "assigned_rate_limits": { 00:09:30.945 
"rw_ios_per_sec": 0, 00:09:30.945 "rw_mbytes_per_sec": 0, 00:09:30.945 "r_mbytes_per_sec": 0, 00:09:30.945 "w_mbytes_per_sec": 0 00:09:30.945 }, 00:09:30.945 "claimed": true, 00:09:30.945 "claim_type": "exclusive_write", 00:09:30.945 "zoned": false, 00:09:30.945 "supported_io_types": { 00:09:30.945 "read": true, 00:09:30.945 "write": true, 00:09:30.945 "unmap": true, 00:09:30.945 "flush": true, 00:09:30.945 "reset": true, 00:09:30.945 "nvme_admin": false, 00:09:30.945 "nvme_io": false, 00:09:30.945 "nvme_io_md": false, 00:09:30.945 "write_zeroes": true, 00:09:30.945 "zcopy": true, 00:09:30.945 "get_zone_info": false, 00:09:30.945 "zone_management": false, 00:09:30.945 "zone_append": false, 00:09:30.945 "compare": false, 00:09:30.945 "compare_and_write": false, 00:09:30.945 "abort": true, 00:09:30.945 "seek_hole": false, 00:09:30.945 "seek_data": false, 00:09:30.945 "copy": true, 00:09:30.945 "nvme_iov_md": false 00:09:30.945 }, 00:09:30.945 "memory_domains": [ 00:09:30.945 { 00:09:30.945 "dma_device_id": "system", 00:09:30.945 "dma_device_type": 1 00:09:30.945 }, 00:09:30.945 { 00:09:30.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.945 "dma_device_type": 2 00:09:30.945 } 00:09:30.945 ], 00:09:30.945 "driver_specific": { 00:09:30.945 "passthru": { 00:09:30.945 "name": "pt1", 00:09:30.945 "base_bdev_name": "malloc1" 00:09:30.945 } 00:09:30.945 } 00:09:30.945 }' 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:30.945 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:31.203 18:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:31.203 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:31.460 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:31.460 "name": "pt2", 00:09:31.460 "aliases": [ 00:09:31.460 "00000000-0000-0000-0000-000000000002" 00:09:31.460 ], 00:09:31.460 "product_name": "passthru", 00:09:31.460 "block_size": 512, 00:09:31.460 "num_blocks": 65536, 00:09:31.460 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:31.460 "assigned_rate_limits": { 00:09:31.460 "rw_ios_per_sec": 0, 00:09:31.460 "rw_mbytes_per_sec": 0, 00:09:31.460 "r_mbytes_per_sec": 0, 00:09:31.460 "w_mbytes_per_sec": 0 
00:09:31.460 }, 00:09:31.460 "claimed": true, 00:09:31.460 "claim_type": "exclusive_write", 00:09:31.460 "zoned": false, 00:09:31.460 "supported_io_types": { 00:09:31.460 "read": true, 00:09:31.460 "write": true, 00:09:31.460 "unmap": true, 00:09:31.460 "flush": true, 00:09:31.460 "reset": true, 00:09:31.460 "nvme_admin": false, 00:09:31.460 "nvme_io": false, 00:09:31.460 "nvme_io_md": false, 00:09:31.460 "write_zeroes": true, 00:09:31.460 "zcopy": true, 00:09:31.460 "get_zone_info": false, 00:09:31.460 "zone_management": false, 00:09:31.460 "zone_append": false, 00:09:31.460 "compare": false, 00:09:31.460 "compare_and_write": false, 00:09:31.460 "abort": true, 00:09:31.460 "seek_hole": false, 00:09:31.460 "seek_data": false, 00:09:31.460 "copy": true, 00:09:31.460 "nvme_iov_md": false 00:09:31.460 }, 00:09:31.460 "memory_domains": [ 00:09:31.460 { 00:09:31.460 "dma_device_id": "system", 00:09:31.460 "dma_device_type": 1 00:09:31.460 }, 00:09:31.460 { 00:09:31.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:31.460 "dma_device_type": 2 00:09:31.460 } 00:09:31.460 ], 00:09:31.460 "driver_specific": { 00:09:31.460 "passthru": { 00:09:31.460 "name": "pt2", 00:09:31.460 "base_bdev_name": "malloc2" 00:09:31.460 } 00:09:31.460 } 00:09:31.460 }' 00:09:31.460 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:31.460 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:31.460 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:31.460 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:31.461 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:31.718 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:31.975 [2024-07-24 18:46:16.803071] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:31.975 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=dba11512-a124-4497-8a7d-b26bf2c0b280 00:09:31.975 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z dba11512-a124-4497-8a7d-b26bf2c0b280 ']' 00:09:31.975 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:31.975 [2024-07-24 18:46:16.971358] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:31.975 [2024-07-24 18:46:16.971369] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:09:31.975 [2024-07-24 18:46:16.971409] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:31.975 [2024-07-24 18:46:16.971440] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:31.975 [2024-07-24 18:46:16.971446] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c5170 name raid_bdev1, state offline 00:09:32.233 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:32.233 18:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:32.233 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:32.233 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:32.233 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:32.233 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:32.491 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:32.491 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:32.491 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:32.491 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:32.749 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:33.006 [2024-07-24 18:46:17.825577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:33.006 [2024-07-24 18:46:17.826561] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:33.006 [2024-07-24 18:46:17.826602] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:33.006 [2024-07-24 18:46:17.826629] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:33.006 [2024-07-24 18:46:17.826639] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:33.006 [2024-07-24 18:46:17.826644] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c6900 name raid_bdev1, state configuring 00:09:33.006 request: 00:09:33.006 { 00:09:33.006 "name": "raid_bdev1", 00:09:33.006 "raid_level": "raid0", 00:09:33.006 "base_bdevs": [ 00:09:33.006 "malloc1", 00:09:33.006 "malloc2" 00:09:33.006 ], 00:09:33.006 "strip_size_kb": 64, 00:09:33.006 "superblock": false, 00:09:33.006 "method": "bdev_raid_create", 00:09:33.006 "req_id": 1 00:09:33.006 } 00:09:33.006 Got JSON-RPC error response 00:09:33.006 response: 00:09:33.006 { 00:09:33.006 "code": -17, 00:09:33.006 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:33.006 } 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:33.006 18:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:33.006 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:33.006 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:33.006 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:33.263 [2024-07-24 18:46:18.154404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:33.263 [2024-07-24 18:46:18.154433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:33.263 [2024-07-24 18:46:18.154443] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c9200 00:09:33.263 [2024-07-24 18:46:18.154449] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:33.263 [2024-07-24 18:46:18.155612] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:33.263 [2024-07-24 18:46:18.155632] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:33.263 [2024-07-24 18:46:18.155675] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:33.263 [2024-07-24 18:46:18.155692] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:33.263 pt1 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:33.263 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:33.264 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:33.521 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:33.521 "name": "raid_bdev1", 00:09:33.521 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:33.521 "strip_size_kb": 64, 00:09:33.521 "state": "configuring", 00:09:33.521 "raid_level": "raid0", 00:09:33.521 "superblock": true, 00:09:33.521 "num_base_bdevs": 2, 00:09:33.521 "num_base_bdevs_discovered": 1, 00:09:33.521 "num_base_bdevs_operational": 2, 00:09:33.521 "base_bdevs_list": [ 00:09:33.521 { 00:09:33.521 "name": "pt1", 00:09:33.521 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:33.521 "is_configured": true, 00:09:33.521 "data_offset": 2048, 00:09:33.521 "data_size": 63488 00:09:33.521 }, 00:09:33.521 { 00:09:33.521 "name": null, 00:09:33.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:33.521 "is_configured": false, 00:09:33.521 "data_offset": 2048, 00:09:33.521 "data_size": 63488 00:09:33.521 } 00:09:33.521 ] 00:09:33.521 }' 00:09:33.521 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:33.521 18:46:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:34.125 [2024-07-24 18:46:18.964497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:34.125 [2024-07-24 18:46:18.964532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:34.125 [2024-07-24 18:46:18.964543] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x121c050 00:09:34.125 [2024-07-24 18:46:18.964566] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:34.125 [2024-07-24 18:46:18.964804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:34.125 [2024-07-24 18:46:18.964814] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:34.125 [2024-07-24 18:46:18.964854] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:34.125 [2024-07-24 18:46:18.964866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:34.125 [2024-07-24 18:46:18.964931] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x121afe0 00:09:34.125 [2024-07-24 18:46:18.964936] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:34.125 [2024-07-24 18:46:18.965051] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c8c60 00:09:34.125 [2024-07-24 18:46:18.965130] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x121afe0 00:09:34.125 [2024-07-24 18:46:18.965136] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x121afe0 00:09:34.125 [2024-07-24 18:46:18.965201] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:34.125 pt2 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:34.125 18:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.383 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:09:34.383 "name": "raid_bdev1", 00:09:34.383 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:34.383 "strip_size_kb": 64, 00:09:34.383 "state": "online", 00:09:34.383 "raid_level": "raid0", 00:09:34.383 "superblock": true, 00:09:34.383 "num_base_bdevs": 2, 00:09:34.383 "num_base_bdevs_discovered": 2, 00:09:34.383 "num_base_bdevs_operational": 2, 00:09:34.383 "base_bdevs_list": [ 00:09:34.383 { 00:09:34.383 "name": "pt1", 00:09:34.383 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:34.383 "is_configured": true, 00:09:34.383 "data_offset": 2048, 00:09:34.383 "data_size": 63488 00:09:34.383 }, 00:09:34.383 { 00:09:34.383 "name": "pt2", 00:09:34.383 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:34.383 "is_configured": true, 00:09:34.383 "data_offset": 2048, 00:09:34.383 "data_size": 63488 00:09:34.383 } 00:09:34.383 ] 00:09:34.383 }' 00:09:34.383 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:34.383 18:46:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:34.640 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:34.898 [2024-07-24 18:46:19.794785] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:34.898 "name": "raid_bdev1", 00:09:34.898 "aliases": [ 00:09:34.898 "dba11512-a124-4497-8a7d-b26bf2c0b280" 00:09:34.898 ], 00:09:34.898 "product_name": "Raid Volume", 00:09:34.898 "block_size": 512, 00:09:34.898 "num_blocks": 126976, 00:09:34.898 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:34.898 "assigned_rate_limits": { 00:09:34.898 "rw_ios_per_sec": 0, 00:09:34.898 "rw_mbytes_per_sec": 0, 00:09:34.898 "r_mbytes_per_sec": 0, 00:09:34.898 "w_mbytes_per_sec": 0 00:09:34.898 }, 00:09:34.898 "claimed": false, 00:09:34.898 "zoned": false, 00:09:34.898 "supported_io_types": { 00:09:34.898 "read": true, 00:09:34.898 "write": true, 00:09:34.898 "unmap": true, 00:09:34.898 "flush": true, 00:09:34.898 "reset": true, 00:09:34.898 "nvme_admin": false, 00:09:34.898 "nvme_io": false, 00:09:34.898 "nvme_io_md": false, 00:09:34.898 "write_zeroes": true, 00:09:34.898 "zcopy": false, 00:09:34.898 "get_zone_info": false, 00:09:34.898 "zone_management": false, 00:09:34.898 "zone_append": false, 00:09:34.898 "compare": false, 00:09:34.898 "compare_and_write": false, 00:09:34.898 "abort": false, 00:09:34.898 "seek_hole": false, 00:09:34.898 "seek_data": false, 00:09:34.898 "copy": false, 00:09:34.898 "nvme_iov_md": false 00:09:34.898 }, 00:09:34.898 "memory_domains": [ 00:09:34.898 { 00:09:34.898 "dma_device_id": 
"system", 00:09:34.898 "dma_device_type": 1 00:09:34.898 }, 00:09:34.898 { 00:09:34.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:34.898 "dma_device_type": 2 00:09:34.898 }, 00:09:34.898 { 00:09:34.898 "dma_device_id": "system", 00:09:34.898 "dma_device_type": 1 00:09:34.898 }, 00:09:34.898 { 00:09:34.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:34.898 "dma_device_type": 2 00:09:34.898 } 00:09:34.898 ], 00:09:34.898 "driver_specific": { 00:09:34.898 "raid": { 00:09:34.898 "uuid": "dba11512-a124-4497-8a7d-b26bf2c0b280", 00:09:34.898 "strip_size_kb": 64, 00:09:34.898 "state": "online", 00:09:34.898 "raid_level": "raid0", 00:09:34.898 "superblock": true, 00:09:34.898 "num_base_bdevs": 2, 00:09:34.898 "num_base_bdevs_discovered": 2, 00:09:34.898 "num_base_bdevs_operational": 2, 00:09:34.898 "base_bdevs_list": [ 00:09:34.898 { 00:09:34.898 "name": "pt1", 00:09:34.898 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:34.898 "is_configured": true, 00:09:34.898 "data_offset": 2048, 00:09:34.898 "data_size": 63488 00:09:34.898 }, 00:09:34.898 { 00:09:34.898 "name": "pt2", 00:09:34.898 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:34.898 "is_configured": true, 00:09:34.898 "data_offset": 2048, 00:09:34.898 "data_size": 63488 00:09:34.898 } 00:09:34.898 ] 00:09:34.898 } 00:09:34.898 } 00:09:34.898 }' 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:34.898 pt2' 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:34.898 18:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:35.156 "name": "pt1", 00:09:35.156 "aliases": [ 00:09:35.156 "00000000-0000-0000-0000-000000000001" 00:09:35.156 ], 00:09:35.156 "product_name": "passthru", 00:09:35.156 "block_size": 512, 00:09:35.156 "num_blocks": 65536, 00:09:35.156 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:35.156 "assigned_rate_limits": { 00:09:35.156 "rw_ios_per_sec": 0, 00:09:35.156 "rw_mbytes_per_sec": 0, 00:09:35.156 "r_mbytes_per_sec": 0, 00:09:35.156 "w_mbytes_per_sec": 0 00:09:35.156 }, 00:09:35.156 "claimed": true, 00:09:35.156 "claim_type": "exclusive_write", 00:09:35.156 "zoned": false, 00:09:35.156 "supported_io_types": { 00:09:35.156 "read": true, 00:09:35.156 "write": true, 00:09:35.156 "unmap": true, 00:09:35.156 "flush": true, 00:09:35.156 "reset": true, 00:09:35.156 "nvme_admin": false, 00:09:35.156 "nvme_io": false, 00:09:35.156 "nvme_io_md": false, 00:09:35.156 "write_zeroes": true, 00:09:35.156 "zcopy": true, 00:09:35.156 "get_zone_info": false, 00:09:35.156 "zone_management": false, 00:09:35.156 "zone_append": false, 00:09:35.156 "compare": false, 00:09:35.156 "compare_and_write": false, 00:09:35.156 "abort": true, 00:09:35.156 "seek_hole": false, 00:09:35.156 "seek_data": false, 00:09:35.156 "copy": true, 00:09:35.156 "nvme_iov_md": false 00:09:35.156 }, 00:09:35.156 "memory_domains": [ 00:09:35.156 { 00:09:35.156 "dma_device_id": "system", 00:09:35.156 "dma_device_type": 1 00:09:35.156 }, 
00:09:35.156 { 00:09:35.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.156 "dma_device_type": 2 00:09:35.156 } 00:09:35.156 ], 00:09:35.156 "driver_specific": { 00:09:35.156 "passthru": { 00:09:35.156 "name": "pt1", 00:09:35.156 "base_bdev_name": "malloc1" 00:09:35.156 } 00:09:35.156 } 00:09:35.156 }' 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:35.156 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:35.413 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:35.414 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:35.671 "name": "pt2", 00:09:35.671 "aliases": [ 00:09:35.671 "00000000-0000-0000-0000-000000000002" 00:09:35.671 ], 00:09:35.671 "product_name": "passthru", 00:09:35.671 "block_size": 512, 00:09:35.671 "num_blocks": 65536, 00:09:35.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:35.671 "assigned_rate_limits": { 00:09:35.671 "rw_ios_per_sec": 0, 00:09:35.671 "rw_mbytes_per_sec": 0, 00:09:35.671 "r_mbytes_per_sec": 0, 00:09:35.671 "w_mbytes_per_sec": 0 00:09:35.671 }, 00:09:35.671 "claimed": true, 00:09:35.671 "claim_type": "exclusive_write", 00:09:35.671 "zoned": false, 00:09:35.671 "supported_io_types": { 00:09:35.671 "read": true, 00:09:35.671 "write": true, 00:09:35.671 "unmap": true, 00:09:35.671 "flush": true, 00:09:35.671 "reset": true, 00:09:35.671 "nvme_admin": false, 00:09:35.671 "nvme_io": false, 00:09:35.671 "nvme_io_md": false, 00:09:35.671 "write_zeroes": true, 00:09:35.671 "zcopy": true, 00:09:35.671 "get_zone_info": false, 00:09:35.671 "zone_management": false, 00:09:35.671 "zone_append": false, 00:09:35.671 "compare": false, 00:09:35.671 "compare_and_write": false, 00:09:35.671 "abort": true, 00:09:35.671 "seek_hole": false, 00:09:35.671 "seek_data": false, 00:09:35.671 "copy": true, 00:09:35.671 "nvme_iov_md": false 00:09:35.671 }, 00:09:35.671 "memory_domains": [ 00:09:35.671 { 00:09:35.671 "dma_device_id": "system", 00:09:35.671 "dma_device_type": 1 00:09:35.671 }, 00:09:35.671 { 00:09:35.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.671 "dma_device_type": 2 00:09:35.671 } 00:09:35.671 ], 
00:09:35.671 "driver_specific": { 00:09:35.671 "passthru": { 00:09:35.671 "name": "pt2", 00:09:35.671 "base_bdev_name": "malloc2" 00:09:35.671 } 00:09:35.671 } 00:09:35.671 }' 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:35.671 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:35.928 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:35.928 [2024-07-24 18:46:20.933709] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' dba11512-a124-4497-8a7d-b26bf2c0b280 '!=' dba11512-a124-4497-8a7d-b26bf2c0b280 ']' 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2049533 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2049533 ']' 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2049533 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2049533 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2049533' 00:09:36.186 killing process with pid 2049533 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2049533 00:09:36.186 [2024-07-24 18:46:20.988416] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:36.186 [2024-07-24 
18:46:20.988456] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:36.186 [2024-07-24 18:46:20.988491] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:36.186 [2024-07-24 18:46:20.988497] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x121afe0 name raid_bdev1, state offline 00:09:36.186 18:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2049533 00:09:36.186 [2024-07-24 18:46:21.003855] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:36.186 18:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:36.186 00:09:36.186 real 0m8.001s 00:09:36.186 user 0m14.426s 00:09:36.186 sys 0m1.252s 00:09:36.186 18:46:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:36.186 18:46:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:36.186 ************************************ 00:09:36.186 END TEST raid_superblock_test 00:09:36.186 ************************************ 00:09:36.444 18:46:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:09:36.444 18:46:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:36.444 18:46:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:36.444 18:46:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:36.444 ************************************ 00:09:36.444 START TEST raid_read_error_test 00:09:36.444 ************************************ 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:36.444 18:46:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.O2iq6RUqMC 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2051122 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2051122 /var/tmp/spdk-raid.sock 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2051122 ']' 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:36.444 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:36.444 18:46:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:36.444 [2024-07-24 18:46:21.286844] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:09:36.445 [2024-07-24 18:46:21.286880] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2051122 ] 00:09:36.445 [2024-07-24 18:46:21.344268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.445 [2024-07-24 18:46:21.422425] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.702 [2024-07-24 18:46:21.476206] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:36.702 [2024-07-24 18:46:21.476231] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:37.267 18:46:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:37.267 18:46:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:09:37.267 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:37.267 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:37.267 BaseBdev1_malloc 00:09:37.267 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:37.525 true 00:09:37.525 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:37.782 [2024-07-24 18:46:22.568721] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:37.782 [2024-07-24 18:46:22.568752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:37.782 [2024-07-24 18:46:22.568762] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x978d20 00:09:37.782 [2024-07-24 18:46:22.568768] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:37.782 [2024-07-24 18:46:22.570011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:37.783 [2024-07-24 18:46:22.570031] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:37.783 BaseBdev1 00:09:37.783 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:37.783 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:37.783 BaseBdev2_malloc 00:09:37.783 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:38.041 true 00:09:38.041 18:46:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:38.299 [2024-07-24 18:46:23.053379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:38.299 [2024-07-24 18:46:23.053411] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:38.299 [2024-07-24 18:46:23.053422] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x97dd50 00:09:38.299 [2024-07-24 18:46:23.053428] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:38.299 [2024-07-24 18:46:23.054511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:38.299 [2024-07-24 18:46:23.054531] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:38.299 BaseBdev2 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:38.299 [2024-07-24 18:46:23.217836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:38.299 [2024-07-24 18:46:23.218764] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:38.299 [2024-07-24 18:46:23.218894] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x97f0e0 00:09:38.299 [2024-07-24 18:46:23.218907] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:38.299 [2024-07-24 18:46:23.219037] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9877d0 00:09:38.299 [2024-07-24 18:46:23.219133] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x97f0e0 00:09:38.299 [2024-07-24 18:46:23.219138] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x97f0e0 00:09:38.299 [2024-07-24 18:46:23.219207] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.299 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:38.556 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:38.556 "name": "raid_bdev1", 00:09:38.556 "uuid": "364d8bcb-33d7-45d1-b8a9-cbd1b8a9e1c3", 00:09:38.556 "strip_size_kb": 64, 00:09:38.556 "state": "online", 00:09:38.556 "raid_level": "raid0", 
00:09:38.556 "superblock": true, 00:09:38.556 "num_base_bdevs": 2, 00:09:38.556 "num_base_bdevs_discovered": 2, 00:09:38.556 "num_base_bdevs_operational": 2, 00:09:38.556 "base_bdevs_list": [ 00:09:38.556 { 00:09:38.556 "name": "BaseBdev1", 00:09:38.556 "uuid": "dbb88219-257e-5c53-8d71-03de5da013a8", 00:09:38.556 "is_configured": true, 00:09:38.556 "data_offset": 2048, 00:09:38.556 "data_size": 63488 00:09:38.556 }, 00:09:38.556 { 00:09:38.556 "name": "BaseBdev2", 00:09:38.556 "uuid": "bf376c16-96ca-51a9-bbbf-dc8d092ebf53", 00:09:38.556 "is_configured": true, 00:09:38.556 "data_offset": 2048, 00:09:38.556 "data_size": 63488 00:09:38.556 } 00:09:38.556 ] 00:09:38.556 }' 00:09:38.556 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:38.556 18:46:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.121 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:39.121 18:46:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:39.121 [2024-07-24 18:46:23.967972] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x97aac0 00:09:40.056 18:46:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:40.056 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:40.314 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.314 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:40.314 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:40.314 "name": "raid_bdev1", 00:09:40.314 "uuid": "364d8bcb-33d7-45d1-b8a9-cbd1b8a9e1c3", 00:09:40.314 "strip_size_kb": 64, 00:09:40.314 "state": "online", 00:09:40.314 
"raid_level": "raid0", 00:09:40.314 "superblock": true, 00:09:40.314 "num_base_bdevs": 2, 00:09:40.314 "num_base_bdevs_discovered": 2, 00:09:40.314 "num_base_bdevs_operational": 2, 00:09:40.314 "base_bdevs_list": [ 00:09:40.314 { 00:09:40.314 "name": "BaseBdev1", 00:09:40.314 "uuid": "dbb88219-257e-5c53-8d71-03de5da013a8", 00:09:40.314 "is_configured": true, 00:09:40.314 "data_offset": 2048, 00:09:40.314 "data_size": 63488 00:09:40.314 }, 00:09:40.314 { 00:09:40.314 "name": "BaseBdev2", 00:09:40.314 "uuid": "bf376c16-96ca-51a9-bbbf-dc8d092ebf53", 00:09:40.314 "is_configured": true, 00:09:40.314 "data_offset": 2048, 00:09:40.314 "data_size": 63488 00:09:40.314 } 00:09:40.314 ] 00:09:40.314 }' 00:09:40.314 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:40.314 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:40.881 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:40.881 [2024-07-24 18:46:25.875733] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:40.881 [2024-07-24 18:46:25.875767] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:40.881 [2024-07-24 18:46:25.877793] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:40.881 [2024-07-24 18:46:25.877814] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:40.881 [2024-07-24 18:46:25.877830] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:40.881 [2024-07-24 18:46:25.877835] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97f0e0 name raid_bdev1, state offline 00:09:40.881 0 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2051122 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2051122 ']' 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2051122 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2051122 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2051122' 00:09:41.140 killing process with pid 2051122 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2051122 00:09:41.140 [2024-07-24 18:46:25.932850] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:41.140 18:46:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2051122 00:09:41.140 [2024-07-24 18:46:25.942240] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # 
grep -v Job /raidtest/tmp.O2iq6RUqMC 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:09:41.140 00:09:41.140 real 0m4.888s 00:09:41.140 user 0m7.458s 00:09:41.140 sys 0m0.708s 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:41.140 18:46:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:41.140 ************************************ 00:09:41.140 END TEST raid_read_error_test 00:09:41.140 ************************************ 00:09:41.400 18:46:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:09:41.400 18:46:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:41.400 18:46:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:41.400 18:46:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:41.400 ************************************ 00:09:41.400 START TEST raid_write_error_test 00:09:41.400 ************************************ 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.DjXybWyRLh 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2052106 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2052106 /var/tmp/spdk-raid.sock 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2052106 ']' 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:41.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:41.400 18:46:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:41.400 [2024-07-24 18:46:26.255113] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:09:41.400 [2024-07-24 18:46:26.255150] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2052106 ] 00:09:41.400 [2024-07-24 18:46:26.318484] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.400 [2024-07-24 18:46:26.388811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.659 [2024-07-24 18:46:26.445254] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:41.659 [2024-07-24 18:46:26.445282] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:42.225 18:46:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:42.225 18:46:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:09:42.225 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:42.225 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:42.225 BaseBdev1_malloc 00:09:42.225 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:42.482 true 00:09:42.482 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:42.739 [2024-07-24 18:46:27.536994] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:42.739 [2024-07-24 18:46:27.537028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:42.739 [2024-07-24 18:46:27.537039] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2699d20 00:09:42.739 [2024-07-24 18:46:27.537045] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:42.739 [2024-07-24 18:46:27.538183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:42.739 [2024-07-24 18:46:27.538205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:42.739 BaseBdev1 00:09:42.739 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:42.739 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:42.739 BaseBdev2_malloc 00:09:42.739 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:42.997 true 00:09:42.997 18:46:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:43.254 [2024-07-24 18:46:28.053674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:43.254 [2024-07-24 18:46:28.053704] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:43.254 [2024-07-24 18:46:28.053714] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x269ed50 00:09:43.254 [2024-07-24 18:46:28.053724] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:43.254 [2024-07-24 18:46:28.054653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:43.254 [2024-07-24 18:46:28.054673] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:43.254 BaseBdev2 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:43.254 [2024-07-24 18:46:28.222141] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:43.254 [2024-07-24 18:46:28.223055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:43.254 [2024-07-24 18:46:28.223181] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a00e0 00:09:43.254 [2024-07-24 18:46:28.223188] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:43.254 [2024-07-24 18:46:28.223317] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a87d0 00:09:43.254 [2024-07-24 18:46:28.223418] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a00e0 00:09:43.254 [2024-07-24 18:46:28.223423] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a00e0 00:09:43.254 [2024-07-24 18:46:28.223498] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.254 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:43.512 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:43.512 "name": "raid_bdev1", 00:09:43.512 "uuid": "ab7289b0-c8c1-47c6-bc57-73c75a6e909b", 00:09:43.512 "strip_size_kb": 64, 00:09:43.512 "state": "online", 00:09:43.513 
"raid_level": "raid0", 00:09:43.513 "superblock": true, 00:09:43.513 "num_base_bdevs": 2, 00:09:43.513 "num_base_bdevs_discovered": 2, 00:09:43.513 "num_base_bdevs_operational": 2, 00:09:43.513 "base_bdevs_list": [ 00:09:43.513 { 00:09:43.513 "name": "BaseBdev1", 00:09:43.513 "uuid": "4e059268-daa9-5783-ac3f-e22854aec585", 00:09:43.513 "is_configured": true, 00:09:43.513 "data_offset": 2048, 00:09:43.513 "data_size": 63488 00:09:43.513 }, 00:09:43.513 { 00:09:43.513 "name": "BaseBdev2", 00:09:43.513 "uuid": "21707c2c-b54e-5431-b26b-50f21a19ccab", 00:09:43.513 "is_configured": true, 00:09:43.513 "data_offset": 2048, 00:09:43.513 "data_size": 63488 00:09:43.513 } 00:09:43.513 ] 00:09:43.513 }' 00:09:43.513 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:43.513 18:46:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:44.079 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:44.080 18:46:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:44.080 [2024-07-24 18:46:28.972291] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x269bac0 00:09:45.015 18:46:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:45.273 "name": "raid_bdev1", 00:09:45.273 "uuid": "ab7289b0-c8c1-47c6-bc57-73c75a6e909b", 00:09:45.273 "strip_size_kb": 64, 
00:09:45.273 "state": "online", 00:09:45.273 "raid_level": "raid0", 00:09:45.273 "superblock": true, 00:09:45.273 "num_base_bdevs": 2, 00:09:45.273 "num_base_bdevs_discovered": 2, 00:09:45.273 "num_base_bdevs_operational": 2, 00:09:45.273 "base_bdevs_list": [ 00:09:45.273 { 00:09:45.273 "name": "BaseBdev1", 00:09:45.273 "uuid": "4e059268-daa9-5783-ac3f-e22854aec585", 00:09:45.273 "is_configured": true, 00:09:45.273 "data_offset": 2048, 00:09:45.273 "data_size": 63488 00:09:45.273 }, 00:09:45.273 { 00:09:45.273 "name": "BaseBdev2", 00:09:45.273 "uuid": "21707c2c-b54e-5431-b26b-50f21a19ccab", 00:09:45.273 "is_configured": true, 00:09:45.273 "data_offset": 2048, 00:09:45.273 "data_size": 63488 00:09:45.273 } 00:09:45.273 ] 00:09:45.273 }' 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:45.273 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:45.840 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:46.099 [2024-07-24 18:46:30.900960] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:46.099 [2024-07-24 18:46:30.900994] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:46.099 [2024-07-24 18:46:30.903068] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:46.099 [2024-07-24 18:46:30.903090] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:46.099 [2024-07-24 18:46:30.903106] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:46.099 [2024-07-24 18:46:30.903111] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a00e0 name raid_bdev1, state offline 00:09:46.099 0 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2052106 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2052106 ']' 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2052106 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2052106 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2052106' 00:09:46.099 killing process with pid 2052106 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2052106 00:09:46.099 [2024-07-24 18:46:30.957310] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:46.099 18:46:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2052106 00:09:46.099 [2024-07-24 18:46:30.966522] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.DjXybWyRLh 
00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:09:46.358 00:09:46.358 real 0m4.964s 00:09:46.358 user 0m7.602s 00:09:46.358 sys 0m0.720s 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:46.358 18:46:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:46.358 ************************************ 00:09:46.358 END TEST raid_write_error_test 00:09:46.358 ************************************ 00:09:46.358 18:46:31 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:46.358 18:46:31 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:09:46.358 18:46:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:46.358 18:46:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:46.358 18:46:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:46.358 ************************************ 00:09:46.358 START TEST raid_state_function_test 00:09:46.358 ************************************ 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:46.358 18:46:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2052916 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2052916' 00:09:46.358 Process raid pid: 2052916 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2052916 /var/tmp/spdk-raid.sock 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2052916 ']' 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:46.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:46.358 18:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:46.358 [2024-07-24 18:46:31.277079] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:09:46.358 [2024-07-24 18:46:31.277116] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:46.358 [2024-07-24 18:46:31.341244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:46.615 [2024-07-24 18:46:31.420441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:46.615 [2024-07-24 18:46:31.473029] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:46.615 [2024-07-24 18:46:31.473054] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:47.183 18:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:47.183 18:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:09:47.183 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:47.441 [2024-07-24 18:46:32.216321] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:47.441 [2024-07-24 18:46:32.216349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:47.441 [2024-07-24 18:46:32.216354] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:47.441 [2024-07-24 18:46:32.216359] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:47.441 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:47.442 "name": "Existed_Raid", 00:09:47.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:47.442 "strip_size_kb": 64, 00:09:47.442 "state": "configuring", 00:09:47.442 "raid_level": "concat", 00:09:47.442 "superblock": false, 
00:09:47.442 "num_base_bdevs": 2, 00:09:47.442 "num_base_bdevs_discovered": 0, 00:09:47.442 "num_base_bdevs_operational": 2, 00:09:47.442 "base_bdevs_list": [ 00:09:47.442 { 00:09:47.442 "name": "BaseBdev1", 00:09:47.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:47.442 "is_configured": false, 00:09:47.442 "data_offset": 0, 00:09:47.442 "data_size": 0 00:09:47.442 }, 00:09:47.442 { 00:09:47.442 "name": "BaseBdev2", 00:09:47.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:47.442 "is_configured": false, 00:09:47.442 "data_offset": 0, 00:09:47.442 "data_size": 0 00:09:47.442 } 00:09:47.442 ] 00:09:47.442 }' 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:47.442 18:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:48.009 18:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:48.268 [2024-07-24 18:46:33.042387] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:48.268 [2024-07-24 18:46:33.042405] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ebb80 name Existed_Raid, state configuring 00:09:48.268 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:48.268 [2024-07-24 18:46:33.214854] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:48.268 [2024-07-24 18:46:33.214871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:48.268 [2024-07-24 18:46:33.214875] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:48.268 [2024-07-24 18:46:33.214880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:48.268 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:48.526 [2024-07-24 18:46:33.391417] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:48.526 BaseBdev1 00:09:48.526 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:48.526 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:48.526 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:48.526 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:48.526 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:48.527 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:48.527 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:09:48.784 [ 00:09:48.784 { 00:09:48.784 "name": "BaseBdev1", 00:09:48.784 "aliases": [ 00:09:48.784 "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5" 00:09:48.784 ], 00:09:48.784 "product_name": "Malloc disk", 00:09:48.784 "block_size": 512, 00:09:48.784 "num_blocks": 65536, 00:09:48.784 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:48.784 "assigned_rate_limits": { 00:09:48.784 "rw_ios_per_sec": 0, 00:09:48.784 "rw_mbytes_per_sec": 0, 00:09:48.784 "r_mbytes_per_sec": 0, 00:09:48.784 "w_mbytes_per_sec": 0 00:09:48.784 }, 00:09:48.784 "claimed": true, 00:09:48.784 "claim_type": "exclusive_write", 00:09:48.784 "zoned": false, 00:09:48.784 "supported_io_types": { 00:09:48.784 "read": true, 00:09:48.784 "write": true, 00:09:48.784 "unmap": true, 00:09:48.784 "flush": true, 00:09:48.784 "reset": true, 00:09:48.784 "nvme_admin": false, 00:09:48.784 "nvme_io": false, 00:09:48.784 "nvme_io_md": false, 00:09:48.784 "write_zeroes": true, 00:09:48.784 "zcopy": true, 00:09:48.784 "get_zone_info": false, 00:09:48.784 "zone_management": false, 00:09:48.784 "zone_append": false, 00:09:48.784 "compare": false, 00:09:48.784 "compare_and_write": false, 00:09:48.784 "abort": true, 00:09:48.784 "seek_hole": false, 00:09:48.784 "seek_data": false, 00:09:48.784 "copy": true, 00:09:48.784 "nvme_iov_md": false 00:09:48.784 }, 00:09:48.784 "memory_domains": [ 00:09:48.784 { 00:09:48.784 "dma_device_id": "system", 00:09:48.784 "dma_device_type": 1 00:09:48.784 }, 00:09:48.784 { 00:09:48.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:48.784 "dma_device_type": 2 00:09:48.784 } 00:09:48.784 ], 00:09:48.784 "driver_specific": {} 00:09:48.784 } 00:09:48.784 ] 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:48.784 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:49.042 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:49.042 "name": "Existed_Raid", 00:09:49.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:49.042 "strip_size_kb": 64, 00:09:49.042 "state": "configuring", 00:09:49.042 
"raid_level": "concat", 00:09:49.042 "superblock": false, 00:09:49.042 "num_base_bdevs": 2, 00:09:49.042 "num_base_bdevs_discovered": 1, 00:09:49.042 "num_base_bdevs_operational": 2, 00:09:49.042 "base_bdevs_list": [ 00:09:49.042 { 00:09:49.042 "name": "BaseBdev1", 00:09:49.042 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:49.042 "is_configured": true, 00:09:49.042 "data_offset": 0, 00:09:49.042 "data_size": 65536 00:09:49.042 }, 00:09:49.042 { 00:09:49.042 "name": "BaseBdev2", 00:09:49.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:49.042 "is_configured": false, 00:09:49.042 "data_offset": 0, 00:09:49.042 "data_size": 0 00:09:49.042 } 00:09:49.042 ] 00:09:49.042 }' 00:09:49.043 18:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:49.043 18:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:49.610 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:49.610 [2024-07-24 18:46:34.570501] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:49.610 [2024-07-24 18:46:34.570535] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16eb470 name Existed_Raid, state configuring 00:09:49.610 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:49.869 [2024-07-24 18:46:34.738948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:49.869 [2024-07-24 18:46:34.739947] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:49.869 [2024-07-24 18:46:34.739970] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.869 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:50.127 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:50.127 "name": "Existed_Raid", 00:09:50.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:50.127 "strip_size_kb": 64, 00:09:50.127 "state": "configuring", 00:09:50.127 "raid_level": "concat", 00:09:50.127 "superblock": false, 00:09:50.127 "num_base_bdevs": 2, 00:09:50.127 "num_base_bdevs_discovered": 1, 00:09:50.127 "num_base_bdevs_operational": 2, 00:09:50.127 "base_bdevs_list": [ 00:09:50.127 { 00:09:50.127 "name": "BaseBdev1", 00:09:50.127 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:50.127 "is_configured": true, 00:09:50.127 "data_offset": 0, 00:09:50.127 "data_size": 65536 00:09:50.127 }, 00:09:50.127 { 00:09:50.127 "name": "BaseBdev2", 00:09:50.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:50.127 "is_configured": false, 00:09:50.127 "data_offset": 0, 00:09:50.127 "data_size": 0 00:09:50.127 } 00:09:50.127 ] 00:09:50.127 }' 00:09:50.127 18:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:50.127 18:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:50.725 [2024-07-24 18:46:35.551706] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:50.725 [2024-07-24 18:46:35.551732] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ec260 00:09:50.725 [2024-07-24 18:46:35.551736] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:50.725 [2024-07-24 18:46:35.551869] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18953d0 00:09:50.725 [2024-07-24 18:46:35.551952] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ec260 00:09:50.725 [2024-07-24 18:46:35.551958] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16ec260 00:09:50.725 [2024-07-24 18:46:35.552100] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:50.725 BaseBdev2 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:50.725 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:50.984 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:50.984 [ 00:09:50.984 { 00:09:50.984 "name": "BaseBdev2", 00:09:50.984 "aliases": [ 00:09:50.984 "eb9de965-03a3-4944-abf6-c224edccfe3e" 00:09:50.984 ], 00:09:50.984 "product_name": "Malloc disk", 00:09:50.984 "block_size": 512, 00:09:50.984 "num_blocks": 65536, 00:09:50.984 "uuid": "eb9de965-03a3-4944-abf6-c224edccfe3e", 00:09:50.984 "assigned_rate_limits": { 00:09:50.984 "rw_ios_per_sec": 0, 00:09:50.984 "rw_mbytes_per_sec": 0, 00:09:50.984 "r_mbytes_per_sec": 0, 00:09:50.984 "w_mbytes_per_sec": 0 00:09:50.984 }, 00:09:50.984 "claimed": true, 00:09:50.984 "claim_type": "exclusive_write", 00:09:50.984 "zoned": false, 00:09:50.984 "supported_io_types": { 00:09:50.984 "read": true, 00:09:50.984 "write": true, 00:09:50.984 "unmap": true, 00:09:50.984 "flush": true, 00:09:50.984 "reset": true, 00:09:50.984 "nvme_admin": false, 00:09:50.984 "nvme_io": false, 00:09:50.984 "nvme_io_md": false, 00:09:50.984 "write_zeroes": true, 00:09:50.984 "zcopy": true, 00:09:50.984 "get_zone_info": false, 00:09:50.984 "zone_management": false, 00:09:50.984 "zone_append": false, 00:09:50.984 "compare": false, 00:09:50.984 "compare_and_write": false, 00:09:50.984 "abort": true, 00:09:50.984 "seek_hole": false, 00:09:50.984 "seek_data": false, 00:09:50.984 "copy": true, 00:09:50.984 "nvme_iov_md": false 00:09:50.984 }, 00:09:50.985 "memory_domains": [ 00:09:50.985 { 00:09:50.985 "dma_device_id": "system", 00:09:50.985 "dma_device_type": 1 00:09:50.985 }, 00:09:50.985 { 00:09:50.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.985 "dma_device_type": 2 00:09:50.985 } 00:09:50.985 ], 00:09:50.985 "driver_specific": {} 00:09:50.985 } 00:09:50.985 ] 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:50.985 18:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:51.244 18:46:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:51.244 "name": "Existed_Raid", 00:09:51.244 "uuid": "3a5cea8d-8b7d-4d33-ab26-5539831c1106", 00:09:51.244 "strip_size_kb": 64, 00:09:51.244 "state": "online", 00:09:51.244 "raid_level": "concat", 00:09:51.244 "superblock": false, 00:09:51.244 "num_base_bdevs": 2, 00:09:51.244 "num_base_bdevs_discovered": 2, 00:09:51.244 "num_base_bdevs_operational": 2, 00:09:51.244 "base_bdevs_list": [ 00:09:51.244 { 00:09:51.244 "name": "BaseBdev1", 00:09:51.244 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:51.244 "is_configured": true, 00:09:51.244 "data_offset": 0, 00:09:51.244 "data_size": 65536 00:09:51.244 }, 00:09:51.244 { 00:09:51.244 "name": "BaseBdev2", 00:09:51.244 "uuid": "eb9de965-03a3-4944-abf6-c224edccfe3e", 00:09:51.244 "is_configured": true, 00:09:51.244 "data_offset": 0, 00:09:51.244 "data_size": 65536 00:09:51.244 } 00:09:51.244 ] 00:09:51.244 }' 00:09:51.244 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:51.244 18:46:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:51.811 [2024-07-24 18:46:36.702848] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:51.811 "name": "Existed_Raid", 00:09:51.811 "aliases": [ 00:09:51.811 "3a5cea8d-8b7d-4d33-ab26-5539831c1106" 00:09:51.811 ], 00:09:51.811 "product_name": "Raid Volume", 00:09:51.811 "block_size": 512, 00:09:51.811 "num_blocks": 131072, 00:09:51.811 "uuid": "3a5cea8d-8b7d-4d33-ab26-5539831c1106", 00:09:51.811 "assigned_rate_limits": { 00:09:51.811 "rw_ios_per_sec": 0, 00:09:51.811 "rw_mbytes_per_sec": 0, 00:09:51.811 "r_mbytes_per_sec": 0, 00:09:51.811 "w_mbytes_per_sec": 0 00:09:51.811 }, 00:09:51.811 "claimed": false, 00:09:51.811 "zoned": false, 00:09:51.811 "supported_io_types": { 00:09:51.811 "read": true, 00:09:51.811 "write": true, 00:09:51.811 "unmap": true, 00:09:51.811 "flush": true, 00:09:51.811 "reset": true, 00:09:51.811 "nvme_admin": false, 00:09:51.811 "nvme_io": false, 00:09:51.811 "nvme_io_md": false, 00:09:51.811 "write_zeroes": true, 00:09:51.811 "zcopy": false, 00:09:51.811 "get_zone_info": false, 00:09:51.811 "zone_management": false, 00:09:51.811 "zone_append": false, 00:09:51.811 "compare": false, 00:09:51.811 "compare_and_write": false, 00:09:51.811 "abort": false, 00:09:51.811 "seek_hole": false, 00:09:51.811 "seek_data": false, 00:09:51.811 "copy": false, 00:09:51.811 
"nvme_iov_md": false 00:09:51.811 }, 00:09:51.811 "memory_domains": [ 00:09:51.811 { 00:09:51.811 "dma_device_id": "system", 00:09:51.811 "dma_device_type": 1 00:09:51.811 }, 00:09:51.811 { 00:09:51.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.811 "dma_device_type": 2 00:09:51.811 }, 00:09:51.811 { 00:09:51.811 "dma_device_id": "system", 00:09:51.811 "dma_device_type": 1 00:09:51.811 }, 00:09:51.811 { 00:09:51.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.811 "dma_device_type": 2 00:09:51.811 } 00:09:51.811 ], 00:09:51.811 "driver_specific": { 00:09:51.811 "raid": { 00:09:51.811 "uuid": "3a5cea8d-8b7d-4d33-ab26-5539831c1106", 00:09:51.811 "strip_size_kb": 64, 00:09:51.811 "state": "online", 00:09:51.811 "raid_level": "concat", 00:09:51.811 "superblock": false, 00:09:51.811 "num_base_bdevs": 2, 00:09:51.811 "num_base_bdevs_discovered": 2, 00:09:51.811 "num_base_bdevs_operational": 2, 00:09:51.811 "base_bdevs_list": [ 00:09:51.811 { 00:09:51.811 "name": "BaseBdev1", 00:09:51.811 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:51.811 "is_configured": true, 00:09:51.811 "data_offset": 0, 00:09:51.811 "data_size": 65536 00:09:51.811 }, 00:09:51.811 { 00:09:51.811 "name": "BaseBdev2", 00:09:51.811 "uuid": "eb9de965-03a3-4944-abf6-c224edccfe3e", 00:09:51.811 "is_configured": true, 00:09:51.811 "data_offset": 0, 00:09:51.811 "data_size": 65536 00:09:51.811 } 00:09:51.811 ] 00:09:51.811 } 00:09:51.811 } 00:09:51.811 }' 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:51.811 BaseBdev2' 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:51.811 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:52.071 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:52.071 "name": "BaseBdev1", 00:09:52.071 "aliases": [ 00:09:52.071 "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5" 00:09:52.071 ], 00:09:52.071 "product_name": "Malloc disk", 00:09:52.071 "block_size": 512, 00:09:52.071 "num_blocks": 65536, 00:09:52.071 "uuid": "4fb5e6eb-9fd4-4745-8fb9-aa907d6b8ae5", 00:09:52.071 "assigned_rate_limits": { 00:09:52.071 "rw_ios_per_sec": 0, 00:09:52.071 "rw_mbytes_per_sec": 0, 00:09:52.071 "r_mbytes_per_sec": 0, 00:09:52.071 "w_mbytes_per_sec": 0 00:09:52.071 }, 00:09:52.071 "claimed": true, 00:09:52.071 "claim_type": "exclusive_write", 00:09:52.071 "zoned": false, 00:09:52.071 "supported_io_types": { 00:09:52.071 "read": true, 00:09:52.071 "write": true, 00:09:52.071 "unmap": true, 00:09:52.071 "flush": true, 00:09:52.071 "reset": true, 00:09:52.071 "nvme_admin": false, 00:09:52.071 "nvme_io": false, 00:09:52.071 "nvme_io_md": false, 00:09:52.071 "write_zeroes": true, 00:09:52.071 "zcopy": true, 00:09:52.071 "get_zone_info": false, 00:09:52.071 "zone_management": false, 00:09:52.071 "zone_append": false, 00:09:52.071 "compare": false, 00:09:52.071 "compare_and_write": false, 00:09:52.071 "abort": true, 00:09:52.071 "seek_hole": false, 00:09:52.071 "seek_data": false, 00:09:52.071 "copy": true, 00:09:52.071 
"nvme_iov_md": false 00:09:52.071 }, 00:09:52.071 "memory_domains": [ 00:09:52.071 { 00:09:52.071 "dma_device_id": "system", 00:09:52.071 "dma_device_type": 1 00:09:52.071 }, 00:09:52.071 { 00:09:52.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:52.071 "dma_device_type": 2 00:09:52.071 } 00:09:52.071 ], 00:09:52.071 "driver_specific": {} 00:09:52.071 }' 00:09:52.071 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:52.071 18:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:52.071 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:52.071 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:52.071 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:52.330 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:52.589 "name": "BaseBdev2", 00:09:52.589 "aliases": [ 00:09:52.589 "eb9de965-03a3-4944-abf6-c224edccfe3e" 00:09:52.589 ], 00:09:52.589 "product_name": "Malloc disk", 00:09:52.589 "block_size": 512, 00:09:52.589 "num_blocks": 65536, 00:09:52.589 "uuid": "eb9de965-03a3-4944-abf6-c224edccfe3e", 00:09:52.589 "assigned_rate_limits": { 00:09:52.589 "rw_ios_per_sec": 0, 00:09:52.589 "rw_mbytes_per_sec": 0, 00:09:52.589 "r_mbytes_per_sec": 0, 00:09:52.589 "w_mbytes_per_sec": 0 00:09:52.589 }, 00:09:52.589 "claimed": true, 00:09:52.589 "claim_type": "exclusive_write", 00:09:52.589 "zoned": false, 00:09:52.589 "supported_io_types": { 00:09:52.589 "read": true, 00:09:52.589 "write": true, 00:09:52.589 "unmap": true, 00:09:52.589 "flush": true, 00:09:52.589 "reset": true, 00:09:52.589 "nvme_admin": false, 00:09:52.589 "nvme_io": false, 00:09:52.589 "nvme_io_md": false, 00:09:52.589 "write_zeroes": true, 00:09:52.589 "zcopy": true, 00:09:52.589 "get_zone_info": false, 00:09:52.589 "zone_management": false, 00:09:52.589 "zone_append": false, 00:09:52.589 "compare": false, 00:09:52.589 "compare_and_write": false, 00:09:52.589 "abort": true, 00:09:52.589 "seek_hole": false, 00:09:52.589 "seek_data": false, 00:09:52.589 "copy": true, 00:09:52.589 "nvme_iov_md": false 00:09:52.589 }, 00:09:52.589 "memory_domains": [ 00:09:52.589 { 00:09:52.589 "dma_device_id": "system", 00:09:52.589 "dma_device_type": 1 00:09:52.589 }, 
00:09:52.589 { 00:09:52.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:52.589 "dma_device_type": 2 00:09:52.589 } 00:09:52.589 ], 00:09:52.589 "driver_specific": {} 00:09:52.589 }' 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:52.589 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:52.848 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:53.108 [2024-07-24 18:46:37.885776] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:53.108 [2024-07-24 18:46:37.885794] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:53.108 [2024-07-24 18:46:37.885822] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:53.108 18:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.108 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:53.108 "name": "Existed_Raid", 00:09:53.108 "uuid": "3a5cea8d-8b7d-4d33-ab26-5539831c1106", 00:09:53.108 "strip_size_kb": 64, 00:09:53.108 "state": "offline", 00:09:53.108 "raid_level": "concat", 00:09:53.108 "superblock": false, 00:09:53.108 "num_base_bdevs": 2, 00:09:53.108 "num_base_bdevs_discovered": 1, 00:09:53.108 "num_base_bdevs_operational": 1, 00:09:53.108 "base_bdevs_list": [ 00:09:53.108 { 00:09:53.108 "name": null, 00:09:53.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:53.108 "is_configured": false, 00:09:53.108 "data_offset": 0, 00:09:53.108 "data_size": 65536 00:09:53.108 }, 00:09:53.108 { 00:09:53.108 "name": "BaseBdev2", 00:09:53.108 "uuid": "eb9de965-03a3-4944-abf6-c224edccfe3e", 00:09:53.108 "is_configured": true, 00:09:53.108 "data_offset": 0, 00:09:53.108 "data_size": 65536 00:09:53.108 } 00:09:53.108 ] 00:09:53.108 }' 00:09:53.108 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:53.108 18:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:53.676 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:53.676 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:53.676 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.676 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:53.935 [2024-07-24 18:46:38.845123] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:53.935 [2024-07-24 18:46:38.845160] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ec260 name Existed_Raid, state offline 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.935 18:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2052916 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2052916 ']' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2052916 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2052916 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2052916' 00:09:54.194 killing process with pid 2052916 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2052916 00:09:54.194 [2024-07-24 18:46:39.098642] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:54.194 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2052916 00:09:54.194 [2024-07-24 18:46:39.099399] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:54.453 00:09:54.453 real 0m8.049s 00:09:54.453 user 0m14.483s 00:09:54.453 sys 0m1.271s 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:54.453 ************************************ 00:09:54.453 END TEST raid_state_function_test 00:09:54.453 ************************************ 00:09:54.453 18:46:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:09:54.453 18:46:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:54.453 18:46:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.453 18:46:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:54.453 ************************************ 00:09:54.453 START TEST raid_state_function_test_sb 00:09:54.453 ************************************ 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs 
)) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:54.453 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2054517 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2054517' 00:09:54.454 Process raid pid: 2054517 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2054517 /var/tmp/spdk-raid.sock 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2054517 ']' 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:54.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:54.454 18:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:54.454 [2024-07-24 18:46:39.396339] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
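Note: the -s in superblock_create_arg above makes bdev_raid_create write an on-disk superblock to each base bdev, and the whole test drives a dedicated bdev_svc application through the /var/tmp/spdk-raid.sock RPC socket. A minimal sketch of bringing such a target up before issuing bdev RPCs — paths abbreviated, and the readiness probe via rpc_get_methods is an assumption standing in for the harness's own waitforlisten helper:
    spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # poll the UNIX-domain RPC socket until the server answers (assumed probe, not the real waitforlisten)
    until spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done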
00:09:54.454 [2024-07-24 18:46:39.396376] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:54.454 [2024-07-24 18:46:39.459178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.712 [2024-07-24 18:46:39.535259] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.712 [2024-07-24 18:46:39.583685] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:54.712 [2024-07-24 18:46:39.583707] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:55.280 18:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:55.280 18:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:09:55.280 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:55.539 [2024-07-24 18:46:40.326787] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:55.539 [2024-07-24 18:46:40.326819] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:55.539 [2024-07-24 18:46:40.326825] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:55.539 [2024-07-24 18:46:40.326830] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:55.539 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:55.540 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:55.540 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:55.540 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:55.540 "name": "Existed_Raid", 00:09:55.540 "uuid": "26022c5c-0b57-4b7e-bebf-856537c04420", 00:09:55.540 "strip_size_kb": 64, 00:09:55.540 "state": "configuring", 00:09:55.540 "raid_level": 
"concat", 00:09:55.540 "superblock": true, 00:09:55.540 "num_base_bdevs": 2, 00:09:55.540 "num_base_bdevs_discovered": 0, 00:09:55.540 "num_base_bdevs_operational": 2, 00:09:55.540 "base_bdevs_list": [ 00:09:55.540 { 00:09:55.540 "name": "BaseBdev1", 00:09:55.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:55.540 "is_configured": false, 00:09:55.540 "data_offset": 0, 00:09:55.540 "data_size": 0 00:09:55.540 }, 00:09:55.540 { 00:09:55.540 "name": "BaseBdev2", 00:09:55.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:55.540 "is_configured": false, 00:09:55.540 "data_offset": 0, 00:09:55.540 "data_size": 0 00:09:55.540 } 00:09:55.540 ] 00:09:55.540 }' 00:09:55.540 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:55.540 18:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:56.107 18:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:56.366 [2024-07-24 18:46:41.144815] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:56.366 [2024-07-24 18:46:41.144841] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x957b80 name Existed_Raid, state configuring 00:09:56.366 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:56.366 [2024-07-24 18:46:41.329306] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:56.366 [2024-07-24 18:46:41.329330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:56.366 [2024-07-24 18:46:41.329335] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:56.366 [2024-07-24 18:46:41.329340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:56.366 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:56.625 [2024-07-24 18:46:41.505786] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:56.625 BaseBdev1 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:56.625 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:56.884 [ 00:09:56.884 { 00:09:56.884 "name": "BaseBdev1", 00:09:56.884 "aliases": [ 00:09:56.884 "0315760c-aee6-49d0-b61a-6170a2ba0863" 00:09:56.884 ], 00:09:56.884 "product_name": "Malloc disk", 00:09:56.884 "block_size": 512, 00:09:56.884 "num_blocks": 65536, 00:09:56.884 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:09:56.884 "assigned_rate_limits": { 00:09:56.884 "rw_ios_per_sec": 0, 00:09:56.884 "rw_mbytes_per_sec": 0, 00:09:56.884 "r_mbytes_per_sec": 0, 00:09:56.884 "w_mbytes_per_sec": 0 00:09:56.884 }, 00:09:56.884 "claimed": true, 00:09:56.884 "claim_type": "exclusive_write", 00:09:56.884 "zoned": false, 00:09:56.884 "supported_io_types": { 00:09:56.884 "read": true, 00:09:56.884 "write": true, 00:09:56.884 "unmap": true, 00:09:56.884 "flush": true, 00:09:56.884 "reset": true, 00:09:56.884 "nvme_admin": false, 00:09:56.884 "nvme_io": false, 00:09:56.884 "nvme_io_md": false, 00:09:56.884 "write_zeroes": true, 00:09:56.884 "zcopy": true, 00:09:56.884 "get_zone_info": false, 00:09:56.884 "zone_management": false, 00:09:56.884 "zone_append": false, 00:09:56.884 "compare": false, 00:09:56.884 "compare_and_write": false, 00:09:56.884 "abort": true, 00:09:56.884 "seek_hole": false, 00:09:56.884 "seek_data": false, 00:09:56.884 "copy": true, 00:09:56.884 "nvme_iov_md": false 00:09:56.884 }, 00:09:56.884 "memory_domains": [ 00:09:56.884 { 00:09:56.884 "dma_device_id": "system", 00:09:56.884 "dma_device_type": 1 00:09:56.884 }, 00:09:56.884 { 00:09:56.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:56.884 "dma_device_type": 2 00:09:56.884 } 00:09:56.884 ], 00:09:56.884 "driver_specific": {} 00:09:56.884 } 00:09:56.884 ] 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:56.884 18:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:57.143 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:57.143 "name": 
"Existed_Raid", 00:09:57.143 "uuid": "a629e70f-d9d2-4648-af81-f3161e793436", 00:09:57.143 "strip_size_kb": 64, 00:09:57.143 "state": "configuring", 00:09:57.143 "raid_level": "concat", 00:09:57.143 "superblock": true, 00:09:57.143 "num_base_bdevs": 2, 00:09:57.143 "num_base_bdevs_discovered": 1, 00:09:57.143 "num_base_bdevs_operational": 2, 00:09:57.143 "base_bdevs_list": [ 00:09:57.143 { 00:09:57.143 "name": "BaseBdev1", 00:09:57.143 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:09:57.143 "is_configured": true, 00:09:57.143 "data_offset": 2048, 00:09:57.143 "data_size": 63488 00:09:57.143 }, 00:09:57.143 { 00:09:57.143 "name": "BaseBdev2", 00:09:57.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:57.143 "is_configured": false, 00:09:57.143 "data_offset": 0, 00:09:57.143 "data_size": 0 00:09:57.143 } 00:09:57.143 ] 00:09:57.143 }' 00:09:57.143 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:57.143 18:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:57.711 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:57.711 [2024-07-24 18:46:42.628692] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:57.711 [2024-07-24 18:46:42.628729] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x957470 name Existed_Raid, state configuring 00:09:57.711 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:57.970 [2024-07-24 18:46:42.797154] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:57.970 [2024-07-24 18:46:42.798199] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:57.970 [2024-07-24 18:46:42.798223] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:57.970 18:46:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:57.970 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:58.229 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:58.229 "name": "Existed_Raid", 00:09:58.229 "uuid": "a22a7315-87f2-41ca-b656-2d171920404e", 00:09:58.229 "strip_size_kb": 64, 00:09:58.229 "state": "configuring", 00:09:58.229 "raid_level": "concat", 00:09:58.229 "superblock": true, 00:09:58.229 "num_base_bdevs": 2, 00:09:58.229 "num_base_bdevs_discovered": 1, 00:09:58.229 "num_base_bdevs_operational": 2, 00:09:58.229 "base_bdevs_list": [ 00:09:58.229 { 00:09:58.229 "name": "BaseBdev1", 00:09:58.229 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:09:58.229 "is_configured": true, 00:09:58.229 "data_offset": 2048, 00:09:58.229 "data_size": 63488 00:09:58.229 }, 00:09:58.229 { 00:09:58.229 "name": "BaseBdev2", 00:09:58.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:58.229 "is_configured": false, 00:09:58.229 "data_offset": 0, 00:09:58.229 "data_size": 0 00:09:58.229 } 00:09:58.229 ] 00:09:58.229 }' 00:09:58.229 18:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:58.229 18:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:58.487 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:58.746 [2024-07-24 18:46:43.597951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:58.746 [2024-07-24 18:46:43.598059] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x958260 00:09:58.746 [2024-07-24 18:46:43.598068] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:58.746 [2024-07-24 18:46:43.598185] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9573c0 00:09:58.746 [2024-07-24 18:46:43.598270] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x958260 00:09:58.746 [2024-07-24 18:46:43.598276] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x958260 00:09:58.746 [2024-07-24 18:46:43.598341] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:58.746 BaseBdev2 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:58.746 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:59.005 [ 00:09:59.005 { 00:09:59.005 "name": "BaseBdev2", 00:09:59.005 "aliases": [ 00:09:59.005 "b8644925-11e8-4526-8ed6-2d5e19361d77" 00:09:59.005 ], 00:09:59.005 "product_name": "Malloc disk", 00:09:59.005 "block_size": 512, 00:09:59.005 "num_blocks": 65536, 00:09:59.005 "uuid": "b8644925-11e8-4526-8ed6-2d5e19361d77", 00:09:59.005 "assigned_rate_limits": { 00:09:59.005 "rw_ios_per_sec": 0, 00:09:59.005 "rw_mbytes_per_sec": 0, 00:09:59.005 "r_mbytes_per_sec": 0, 00:09:59.005 "w_mbytes_per_sec": 0 00:09:59.005 }, 00:09:59.005 "claimed": true, 00:09:59.005 "claim_type": "exclusive_write", 00:09:59.005 "zoned": false, 00:09:59.005 "supported_io_types": { 00:09:59.005 "read": true, 00:09:59.005 "write": true, 00:09:59.005 "unmap": true, 00:09:59.005 "flush": true, 00:09:59.005 "reset": true, 00:09:59.005 "nvme_admin": false, 00:09:59.005 "nvme_io": false, 00:09:59.005 "nvme_io_md": false, 00:09:59.005 "write_zeroes": true, 00:09:59.005 "zcopy": true, 00:09:59.005 "get_zone_info": false, 00:09:59.005 "zone_management": false, 00:09:59.005 "zone_append": false, 00:09:59.005 "compare": false, 00:09:59.005 "compare_and_write": false, 00:09:59.005 "abort": true, 00:09:59.005 "seek_hole": false, 00:09:59.005 "seek_data": false, 00:09:59.005 "copy": true, 00:09:59.005 "nvme_iov_md": false 00:09:59.005 }, 00:09:59.005 "memory_domains": [ 00:09:59.005 { 00:09:59.005 "dma_device_id": "system", 00:09:59.005 "dma_device_type": 1 00:09:59.005 }, 00:09:59.005 { 00:09:59.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.005 "dma_device_type": 2 00:09:59.005 } 00:09:59.005 ], 00:09:59.005 "driver_specific": {} 00:09:59.005 } 00:09:59.005 ] 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:59.005 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:59.006 18:46:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:59.006 18:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:59.265 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:59.265 "name": "Existed_Raid", 00:09:59.265 "uuid": "a22a7315-87f2-41ca-b656-2d171920404e", 00:09:59.265 "strip_size_kb": 64, 00:09:59.265 "state": "online", 00:09:59.265 "raid_level": "concat", 00:09:59.265 "superblock": true, 00:09:59.265 "num_base_bdevs": 2, 00:09:59.265 "num_base_bdevs_discovered": 2, 00:09:59.265 "num_base_bdevs_operational": 2, 00:09:59.265 "base_bdevs_list": [ 00:09:59.265 { 00:09:59.265 "name": "BaseBdev1", 00:09:59.265 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:09:59.265 "is_configured": true, 00:09:59.265 "data_offset": 2048, 00:09:59.265 "data_size": 63488 00:09:59.265 }, 00:09:59.265 { 00:09:59.265 "name": "BaseBdev2", 00:09:59.265 "uuid": "b8644925-11e8-4526-8ed6-2d5e19361d77", 00:09:59.265 "is_configured": true, 00:09:59.265 "data_offset": 2048, 00:09:59.265 "data_size": 63488 00:09:59.265 } 00:09:59.265 ] 00:09:59.265 }' 00:09:59.265 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:59.265 18:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:59.846 [2024-07-24 18:46:44.761133] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:59.846 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:59.846 "name": "Existed_Raid", 00:09:59.846 "aliases": [ 00:09:59.846 "a22a7315-87f2-41ca-b656-2d171920404e" 00:09:59.846 ], 00:09:59.846 "product_name": "Raid Volume", 00:09:59.846 "block_size": 512, 00:09:59.846 "num_blocks": 126976, 00:09:59.846 "uuid": "a22a7315-87f2-41ca-b656-2d171920404e", 00:09:59.846 "assigned_rate_limits": { 00:09:59.846 "rw_ios_per_sec": 0, 00:09:59.846 "rw_mbytes_per_sec": 0, 00:09:59.846 "r_mbytes_per_sec": 0, 00:09:59.846 "w_mbytes_per_sec": 0 00:09:59.846 }, 00:09:59.846 "claimed": false, 00:09:59.846 "zoned": false, 00:09:59.846 "supported_io_types": { 00:09:59.846 "read": true, 00:09:59.846 "write": true, 00:09:59.846 "unmap": true, 00:09:59.846 "flush": true, 00:09:59.846 "reset": true, 00:09:59.846 "nvme_admin": false, 00:09:59.846 "nvme_io": false, 
00:09:59.846 "nvme_io_md": false, 00:09:59.846 "write_zeroes": true, 00:09:59.846 "zcopy": false, 00:09:59.846 "get_zone_info": false, 00:09:59.846 "zone_management": false, 00:09:59.846 "zone_append": false, 00:09:59.847 "compare": false, 00:09:59.847 "compare_and_write": false, 00:09:59.847 "abort": false, 00:09:59.847 "seek_hole": false, 00:09:59.847 "seek_data": false, 00:09:59.847 "copy": false, 00:09:59.847 "nvme_iov_md": false 00:09:59.847 }, 00:09:59.847 "memory_domains": [ 00:09:59.847 { 00:09:59.847 "dma_device_id": "system", 00:09:59.847 "dma_device_type": 1 00:09:59.847 }, 00:09:59.847 { 00:09:59.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.847 "dma_device_type": 2 00:09:59.847 }, 00:09:59.847 { 00:09:59.847 "dma_device_id": "system", 00:09:59.847 "dma_device_type": 1 00:09:59.847 }, 00:09:59.847 { 00:09:59.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.847 "dma_device_type": 2 00:09:59.847 } 00:09:59.847 ], 00:09:59.847 "driver_specific": { 00:09:59.847 "raid": { 00:09:59.847 "uuid": "a22a7315-87f2-41ca-b656-2d171920404e", 00:09:59.847 "strip_size_kb": 64, 00:09:59.847 "state": "online", 00:09:59.847 "raid_level": "concat", 00:09:59.847 "superblock": true, 00:09:59.847 "num_base_bdevs": 2, 00:09:59.847 "num_base_bdevs_discovered": 2, 00:09:59.847 "num_base_bdevs_operational": 2, 00:09:59.847 "base_bdevs_list": [ 00:09:59.847 { 00:09:59.847 "name": "BaseBdev1", 00:09:59.847 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:09:59.847 "is_configured": true, 00:09:59.847 "data_offset": 2048, 00:09:59.847 "data_size": 63488 00:09:59.847 }, 00:09:59.847 { 00:09:59.847 "name": "BaseBdev2", 00:09:59.847 "uuid": "b8644925-11e8-4526-8ed6-2d5e19361d77", 00:09:59.847 "is_configured": true, 00:09:59.847 "data_offset": 2048, 00:09:59.847 "data_size": 63488 00:09:59.847 } 00:09:59.847 ] 00:09:59.847 } 00:09:59.847 } 00:09:59.847 }' 00:09:59.847 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:59.847 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:59.847 BaseBdev2' 00:09:59.847 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:59.847 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:59.847 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:00.106 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:00.106 "name": "BaseBdev1", 00:10:00.106 "aliases": [ 00:10:00.106 "0315760c-aee6-49d0-b61a-6170a2ba0863" 00:10:00.106 ], 00:10:00.106 "product_name": "Malloc disk", 00:10:00.106 "block_size": 512, 00:10:00.106 "num_blocks": 65536, 00:10:00.106 "uuid": "0315760c-aee6-49d0-b61a-6170a2ba0863", 00:10:00.106 "assigned_rate_limits": { 00:10:00.106 "rw_ios_per_sec": 0, 00:10:00.106 "rw_mbytes_per_sec": 0, 00:10:00.106 "r_mbytes_per_sec": 0, 00:10:00.106 "w_mbytes_per_sec": 0 00:10:00.106 }, 00:10:00.106 "claimed": true, 00:10:00.106 "claim_type": "exclusive_write", 00:10:00.106 "zoned": false, 00:10:00.106 "supported_io_types": { 00:10:00.106 "read": true, 00:10:00.106 "write": true, 00:10:00.106 "unmap": true, 00:10:00.106 "flush": true, 00:10:00.106 "reset": true, 00:10:00.106 "nvme_admin": false, 
00:10:00.106 "nvme_io": false, 00:10:00.106 "nvme_io_md": false, 00:10:00.106 "write_zeroes": true, 00:10:00.106 "zcopy": true, 00:10:00.106 "get_zone_info": false, 00:10:00.106 "zone_management": false, 00:10:00.106 "zone_append": false, 00:10:00.106 "compare": false, 00:10:00.106 "compare_and_write": false, 00:10:00.106 "abort": true, 00:10:00.106 "seek_hole": false, 00:10:00.106 "seek_data": false, 00:10:00.106 "copy": true, 00:10:00.106 "nvme_iov_md": false 00:10:00.106 }, 00:10:00.106 "memory_domains": [ 00:10:00.106 { 00:10:00.106 "dma_device_id": "system", 00:10:00.106 "dma_device_type": 1 00:10:00.106 }, 00:10:00.106 { 00:10:00.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:00.106 "dma_device_type": 2 00:10:00.106 } 00:10:00.106 ], 00:10:00.106 "driver_specific": {} 00:10:00.106 }' 00:10:00.106 18:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:00.106 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:00.106 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:00.106 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:00.365 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:00.624 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:00.624 "name": "BaseBdev2", 00:10:00.624 "aliases": [ 00:10:00.624 "b8644925-11e8-4526-8ed6-2d5e19361d77" 00:10:00.624 ], 00:10:00.625 "product_name": "Malloc disk", 00:10:00.625 "block_size": 512, 00:10:00.625 "num_blocks": 65536, 00:10:00.625 "uuid": "b8644925-11e8-4526-8ed6-2d5e19361d77", 00:10:00.625 "assigned_rate_limits": { 00:10:00.625 "rw_ios_per_sec": 0, 00:10:00.625 "rw_mbytes_per_sec": 0, 00:10:00.625 "r_mbytes_per_sec": 0, 00:10:00.625 "w_mbytes_per_sec": 0 00:10:00.625 }, 00:10:00.625 "claimed": true, 00:10:00.625 "claim_type": "exclusive_write", 00:10:00.625 "zoned": false, 00:10:00.625 "supported_io_types": { 00:10:00.625 "read": true, 00:10:00.625 "write": true, 00:10:00.625 "unmap": true, 00:10:00.625 "flush": true, 00:10:00.625 "reset": true, 00:10:00.625 "nvme_admin": false, 00:10:00.625 "nvme_io": false, 00:10:00.625 "nvme_io_md": false, 00:10:00.625 "write_zeroes": true, 00:10:00.625 "zcopy": true, 
00:10:00.625 "get_zone_info": false, 00:10:00.625 "zone_management": false, 00:10:00.625 "zone_append": false, 00:10:00.625 "compare": false, 00:10:00.625 "compare_and_write": false, 00:10:00.625 "abort": true, 00:10:00.625 "seek_hole": false, 00:10:00.625 "seek_data": false, 00:10:00.625 "copy": true, 00:10:00.625 "nvme_iov_md": false 00:10:00.625 }, 00:10:00.625 "memory_domains": [ 00:10:00.625 { 00:10:00.625 "dma_device_id": "system", 00:10:00.625 "dma_device_type": 1 00:10:00.625 }, 00:10:00.625 { 00:10:00.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:00.625 "dma_device_type": 2 00:10:00.625 } 00:10:00.625 ], 00:10:00.625 "driver_specific": {} 00:10:00.625 }' 00:10:00.625 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:00.625 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:00.625 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:00.625 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:00.625 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:00.884 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:01.143 [2024-07-24 18:46:45.912003] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:01.143 [2024-07-24 18:46:45.912024] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:01.143 [2024-07-24 18:46:45.912054] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.143 18:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:01.143 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:01.143 "name": "Existed_Raid", 00:10:01.143 "uuid": "a22a7315-87f2-41ca-b656-2d171920404e", 00:10:01.143 "strip_size_kb": 64, 00:10:01.143 "state": "offline", 00:10:01.143 "raid_level": "concat", 00:10:01.143 "superblock": true, 00:10:01.143 "num_base_bdevs": 2, 00:10:01.143 "num_base_bdevs_discovered": 1, 00:10:01.143 "num_base_bdevs_operational": 1, 00:10:01.143 "base_bdevs_list": [ 00:10:01.143 { 00:10:01.143 "name": null, 00:10:01.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:01.143 "is_configured": false, 00:10:01.143 "data_offset": 2048, 00:10:01.143 "data_size": 63488 00:10:01.143 }, 00:10:01.143 { 00:10:01.143 "name": "BaseBdev2", 00:10:01.143 "uuid": "b8644925-11e8-4526-8ed6-2d5e19361d77", 00:10:01.143 "is_configured": true, 00:10:01.143 "data_offset": 2048, 00:10:01.143 "data_size": 63488 00:10:01.143 } 00:10:01.143 ] 00:10:01.143 }' 00:10:01.143 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:01.143 18:46:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:01.711 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:01.711 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:01.711 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.711 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:01.970 [2024-07-24 18:46:46.899430] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:01.970 [2024-07-24 18:46:46.899485] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x958260 name Existed_Raid, state offline 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
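Note: with the array already offline, the loop traced above walks the remaining base bdevs, confirms the raid bdev is still registered, and removes them; deleting the last one triggers raid_bdev_cleanup, and the empty-name query on the next lines confirms no raid bdev remains. One pass, condensed (i = 1 here, so BaseBdev2 is the bdev removed; error handling omitted):
    raid=$(spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[0]["name"]')
    [[ $raid == Existed_Raid ]]                                                    # offline raid still listed
    spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2    # last base bdev gone -> cleanup
    spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[0]["name"] | select(.)'   # prints nothing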
00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.970 18:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2054517 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2054517 ']' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2054517 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2054517 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2054517' 00:10:02.230 killing process with pid 2054517 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2054517 00:10:02.230 [2024-07-24 18:46:47.139220] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:02.230 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2054517 00:10:02.230 [2024-07-24 18:46:47.140029] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:02.489 18:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:02.489 00:10:02.489 real 0m7.977s 00:10:02.489 user 0m14.286s 00:10:02.489 sys 0m1.318s 00:10:02.489 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:02.489 18:46:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:02.489 ************************************ 00:10:02.489 END TEST raid_state_function_test_sb 00:10:02.489 ************************************ 00:10:02.489 18:46:47 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:02.489 18:46:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:02.489 18:46:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:02.489 18:46:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:02.489 ************************************ 00:10:02.489 START TEST raid_superblock_test 00:10:02.489 ************************************ 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:10:02.489 18:46:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:02.489 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2056105 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2056105 /var/tmp/spdk-raid.sock 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2056105 ']' 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:02.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:02.490 18:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:02.490 [2024-07-24 18:46:47.433489] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
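Before any of the malloc/passthru RPCs can be issued, raid_superblock_test starts the bdev_svc stub application on its own RPC socket and waits for it to answer; the "Waiting for process to start up and listen on UNIX domain socket" message above is printed during that wait. A rough, hypothetical sketch of the launch-and-poll pattern is shown below, using the binary and socket paths from the trace; the real waitforlisten helper in autotest_common.sh handles retries, timeouts and error reporting more carefully than this loop.

# Hypothetical approximation of starting bdev_svc and waiting for its RPC
# socket to come up; not the actual waitforlisten implementation.
app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$app" -r "$sock" -L bdev_raid &
raid_pid=$!

# Poll with a harmless RPC until the socket answers or roughly 10s elapse.
for _ in $(seq 1 100); do
    "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
    sleep 0.1
done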
00:10:02.490 [2024-07-24 18:46:47.433528] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056105 ] 00:10:02.490 [2024-07-24 18:46:47.497637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:02.749 [2024-07-24 18:46:47.575448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.749 [2024-07-24 18:46:47.629358] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:02.749 [2024-07-24 18:46:47.629386] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:03.317 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:03.576 malloc1 00:10:03.576 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:03.576 [2024-07-24 18:46:48.581455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:03.576 [2024-07-24 18:46:48.581494] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:03.576 [2024-07-24 18:46:48.581507] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b6e20 00:10:03.576 [2024-07-24 18:46:48.581513] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:03.576 [2024-07-24 18:46:48.582730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:03.576 [2024-07-24 18:46:48.582751] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:03.835 pt1 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:03.835 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:03.836 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:03.836 malloc2 00:10:03.836 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:04.094 [2024-07-24 18:46:48.926084] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:04.094 [2024-07-24 18:46:48.926116] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:04.094 [2024-07-24 18:46:48.926126] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2560ed0 00:10:04.094 [2024-07-24 18:46:48.926132] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:04.094 [2024-07-24 18:46:48.927179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:04.094 [2024-07-24 18:46:48.927199] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:04.094 pt2 00:10:04.094 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:04.094 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:04.094 18:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:04.094 [2024-07-24 18:46:49.082520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:04.094 [2024-07-24 18:46:49.083356] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:04.094 [2024-07-24 18:46:49.083456] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2560170 00:10:04.094 [2024-07-24 18:46:49.083464] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:04.094 [2024-07-24 18:46:49.083597] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23b6750 00:10:04.094 [2024-07-24 18:46:49.083691] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2560170 00:10:04.094 [2024-07-24 18:46:49.083696] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2560170 00:10:04.094 [2024-07-24 18:46:49.083757] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:04.094 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:04.094 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:04.094 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:04.095 18:46:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:04.095 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:04.353 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:04.353 "name": "raid_bdev1", 00:10:04.353 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:04.353 "strip_size_kb": 64, 00:10:04.353 "state": "online", 00:10:04.353 "raid_level": "concat", 00:10:04.353 "superblock": true, 00:10:04.353 "num_base_bdevs": 2, 00:10:04.353 "num_base_bdevs_discovered": 2, 00:10:04.353 "num_base_bdevs_operational": 2, 00:10:04.353 "base_bdevs_list": [ 00:10:04.353 { 00:10:04.353 "name": "pt1", 00:10:04.353 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:04.354 "is_configured": true, 00:10:04.354 "data_offset": 2048, 00:10:04.354 "data_size": 63488 00:10:04.354 }, 00:10:04.354 { 00:10:04.354 "name": "pt2", 00:10:04.354 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:04.354 "is_configured": true, 00:10:04.354 "data_offset": 2048, 00:10:04.354 "data_size": 63488 00:10:04.354 } 00:10:04.354 ] 00:10:04.354 }' 00:10:04.354 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:04.354 18:46:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:04.921 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:04.921 [2024-07-24 18:46:49.916829] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:05.179 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:05.179 "name": "raid_bdev1", 00:10:05.179 "aliases": [ 00:10:05.179 "ddbe01f2-a5a8-40b3-8925-71ec7a080976" 00:10:05.179 ], 00:10:05.179 "product_name": "Raid Volume", 00:10:05.179 "block_size": 512, 00:10:05.179 "num_blocks": 126976, 00:10:05.179 "uuid": 
"ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:05.179 "assigned_rate_limits": { 00:10:05.179 "rw_ios_per_sec": 0, 00:10:05.179 "rw_mbytes_per_sec": 0, 00:10:05.179 "r_mbytes_per_sec": 0, 00:10:05.179 "w_mbytes_per_sec": 0 00:10:05.179 }, 00:10:05.179 "claimed": false, 00:10:05.179 "zoned": false, 00:10:05.179 "supported_io_types": { 00:10:05.179 "read": true, 00:10:05.179 "write": true, 00:10:05.179 "unmap": true, 00:10:05.179 "flush": true, 00:10:05.179 "reset": true, 00:10:05.179 "nvme_admin": false, 00:10:05.179 "nvme_io": false, 00:10:05.179 "nvme_io_md": false, 00:10:05.179 "write_zeroes": true, 00:10:05.179 "zcopy": false, 00:10:05.179 "get_zone_info": false, 00:10:05.179 "zone_management": false, 00:10:05.179 "zone_append": false, 00:10:05.179 "compare": false, 00:10:05.179 "compare_and_write": false, 00:10:05.179 "abort": false, 00:10:05.179 "seek_hole": false, 00:10:05.179 "seek_data": false, 00:10:05.179 "copy": false, 00:10:05.179 "nvme_iov_md": false 00:10:05.179 }, 00:10:05.179 "memory_domains": [ 00:10:05.179 { 00:10:05.179 "dma_device_id": "system", 00:10:05.179 "dma_device_type": 1 00:10:05.179 }, 00:10:05.179 { 00:10:05.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.179 "dma_device_type": 2 00:10:05.179 }, 00:10:05.179 { 00:10:05.179 "dma_device_id": "system", 00:10:05.179 "dma_device_type": 1 00:10:05.179 }, 00:10:05.179 { 00:10:05.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.179 "dma_device_type": 2 00:10:05.179 } 00:10:05.179 ], 00:10:05.179 "driver_specific": { 00:10:05.179 "raid": { 00:10:05.179 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:05.179 "strip_size_kb": 64, 00:10:05.179 "state": "online", 00:10:05.179 "raid_level": "concat", 00:10:05.179 "superblock": true, 00:10:05.179 "num_base_bdevs": 2, 00:10:05.179 "num_base_bdevs_discovered": 2, 00:10:05.179 "num_base_bdevs_operational": 2, 00:10:05.180 "base_bdevs_list": [ 00:10:05.180 { 00:10:05.180 "name": "pt1", 00:10:05.180 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:05.180 "is_configured": true, 00:10:05.180 "data_offset": 2048, 00:10:05.180 "data_size": 63488 00:10:05.180 }, 00:10:05.180 { 00:10:05.180 "name": "pt2", 00:10:05.180 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:05.180 "is_configured": true, 00:10:05.180 "data_offset": 2048, 00:10:05.180 "data_size": 63488 00:10:05.180 } 00:10:05.180 ] 00:10:05.180 } 00:10:05.180 } 00:10:05.180 }' 00:10:05.180 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:05.180 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:05.180 pt2' 00:10:05.180 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:05.180 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:05.180 18:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:05.180 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:05.180 "name": "pt1", 00:10:05.180 "aliases": [ 00:10:05.180 "00000000-0000-0000-0000-000000000001" 00:10:05.180 ], 00:10:05.180 "product_name": "passthru", 00:10:05.180 "block_size": 512, 00:10:05.180 "num_blocks": 65536, 00:10:05.180 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:05.180 "assigned_rate_limits": { 00:10:05.180 
"rw_ios_per_sec": 0, 00:10:05.180 "rw_mbytes_per_sec": 0, 00:10:05.180 "r_mbytes_per_sec": 0, 00:10:05.180 "w_mbytes_per_sec": 0 00:10:05.180 }, 00:10:05.180 "claimed": true, 00:10:05.180 "claim_type": "exclusive_write", 00:10:05.180 "zoned": false, 00:10:05.180 "supported_io_types": { 00:10:05.180 "read": true, 00:10:05.180 "write": true, 00:10:05.180 "unmap": true, 00:10:05.180 "flush": true, 00:10:05.180 "reset": true, 00:10:05.180 "nvme_admin": false, 00:10:05.180 "nvme_io": false, 00:10:05.180 "nvme_io_md": false, 00:10:05.180 "write_zeroes": true, 00:10:05.180 "zcopy": true, 00:10:05.180 "get_zone_info": false, 00:10:05.180 "zone_management": false, 00:10:05.180 "zone_append": false, 00:10:05.180 "compare": false, 00:10:05.180 "compare_and_write": false, 00:10:05.180 "abort": true, 00:10:05.180 "seek_hole": false, 00:10:05.180 "seek_data": false, 00:10:05.180 "copy": true, 00:10:05.180 "nvme_iov_md": false 00:10:05.180 }, 00:10:05.180 "memory_domains": [ 00:10:05.180 { 00:10:05.180 "dma_device_id": "system", 00:10:05.180 "dma_device_type": 1 00:10:05.180 }, 00:10:05.180 { 00:10:05.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.180 "dma_device_type": 2 00:10:05.180 } 00:10:05.180 ], 00:10:05.180 "driver_specific": { 00:10:05.180 "passthru": { 00:10:05.180 "name": "pt1", 00:10:05.180 "base_bdev_name": "malloc1" 00:10:05.180 } 00:10:05.180 } 00:10:05.180 }' 00:10:05.180 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:05.438 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:05.696 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:05.696 "name": "pt2", 00:10:05.696 "aliases": [ 00:10:05.696 "00000000-0000-0000-0000-000000000002" 00:10:05.696 ], 00:10:05.696 "product_name": "passthru", 00:10:05.696 "block_size": 512, 00:10:05.696 "num_blocks": 65536, 00:10:05.696 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:05.696 "assigned_rate_limits": { 00:10:05.696 "rw_ios_per_sec": 0, 00:10:05.696 "rw_mbytes_per_sec": 0, 00:10:05.696 "r_mbytes_per_sec": 0, 00:10:05.696 "w_mbytes_per_sec": 0 
00:10:05.696 }, 00:10:05.696 "claimed": true, 00:10:05.696 "claim_type": "exclusive_write", 00:10:05.696 "zoned": false, 00:10:05.696 "supported_io_types": { 00:10:05.696 "read": true, 00:10:05.696 "write": true, 00:10:05.696 "unmap": true, 00:10:05.696 "flush": true, 00:10:05.696 "reset": true, 00:10:05.696 "nvme_admin": false, 00:10:05.696 "nvme_io": false, 00:10:05.696 "nvme_io_md": false, 00:10:05.696 "write_zeroes": true, 00:10:05.696 "zcopy": true, 00:10:05.696 "get_zone_info": false, 00:10:05.696 "zone_management": false, 00:10:05.696 "zone_append": false, 00:10:05.696 "compare": false, 00:10:05.696 "compare_and_write": false, 00:10:05.696 "abort": true, 00:10:05.696 "seek_hole": false, 00:10:05.696 "seek_data": false, 00:10:05.696 "copy": true, 00:10:05.696 "nvme_iov_md": false 00:10:05.696 }, 00:10:05.696 "memory_domains": [ 00:10:05.696 { 00:10:05.696 "dma_device_id": "system", 00:10:05.696 "dma_device_type": 1 00:10:05.696 }, 00:10:05.696 { 00:10:05.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.696 "dma_device_type": 2 00:10:05.696 } 00:10:05.696 ], 00:10:05.696 "driver_specific": { 00:10:05.696 "passthru": { 00:10:05.696 "name": "pt2", 00:10:05.696 "base_bdev_name": "malloc2" 00:10:05.696 } 00:10:05.696 } 00:10:05.696 }' 00:10:05.696 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.696 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.696 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:05.696 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:05.955 18:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:06.214 [2024-07-24 18:46:51.059766] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:06.214 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ddbe01f2-a5a8-40b3-8925-71ec7a080976 00:10:06.214 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ddbe01f2-a5a8-40b3-8925-71ec7a080976 ']' 00:10:06.214 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:06.472 [2024-07-24 18:46:51.228047] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:06.473 [2024-07-24 18:46:51.228061] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:10:06.473 [2024-07-24 18:46:51.228104] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:06.473 [2024-07-24 18:46:51.228134] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:06.473 [2024-07-24 18:46:51.228140] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2560170 name raid_bdev1, state offline 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:06.473 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:06.732 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:06.732 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:06.732 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:06.993 18:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:07.299 [2024-07-24 18:46:52.070354] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:07.299 [2024-07-24 18:46:52.071310] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:07.299 [2024-07-24 18:46:52.071353] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:07.299 [2024-07-24 18:46:52.071380] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:07.299 [2024-07-24 18:46:52.071390] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:07.299 [2024-07-24 18:46:52.071395] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2561900 name raid_bdev1, state configuring 00:10:07.299 request: 00:10:07.299 { 00:10:07.299 "name": "raid_bdev1", 00:10:07.299 "raid_level": "concat", 00:10:07.299 "base_bdevs": [ 00:10:07.299 "malloc1", 00:10:07.299 "malloc2" 00:10:07.299 ], 00:10:07.299 "strip_size_kb": 64, 00:10:07.299 "superblock": false, 00:10:07.299 "method": "bdev_raid_create", 00:10:07.299 "req_id": 1 00:10:07.299 } 00:10:07.299 Got JSON-RPC error response 00:10:07.299 response: 00:10:07.299 { 00:10:07.299 "code": -17, 00:10:07.299 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:07.299 } 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:07.299 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:07.558 [2024-07-24 18:46:52.415209] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:07.558 [2024-07-24 18:46:52.415246] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:07.558 [2024-07-24 18:46:52.415257] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b5c10 00:10:07.558 [2024-07-24 18:46:52.415279] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:07.558 [2024-07-24 18:46:52.416513] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:07.558 [2024-07-24 18:46:52.416534] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:07.558 [2024-07-24 18:46:52.416584] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:07.558 [2024-07-24 18:46:52.416601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:07.558 pt1 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:07.558 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:07.817 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:07.817 "name": "raid_bdev1", 00:10:07.817 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:07.817 "strip_size_kb": 64, 00:10:07.817 "state": "configuring", 00:10:07.817 "raid_level": "concat", 00:10:07.817 "superblock": true, 00:10:07.817 "num_base_bdevs": 2, 00:10:07.817 "num_base_bdevs_discovered": 1, 00:10:07.817 "num_base_bdevs_operational": 2, 00:10:07.817 "base_bdevs_list": [ 00:10:07.817 { 00:10:07.817 "name": "pt1", 00:10:07.817 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:07.817 "is_configured": true, 00:10:07.817 "data_offset": 2048, 00:10:07.817 "data_size": 63488 00:10:07.817 }, 00:10:07.817 { 00:10:07.817 "name": null, 00:10:07.817 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:07.817 "is_configured": false, 00:10:07.817 "data_offset": 2048, 00:10:07.817 "data_size": 63488 00:10:07.817 } 00:10:07.817 ] 00:10:07.817 }' 00:10:07.817 18:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:07.817 18:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:08.383 [2024-07-24 18:46:53.233336] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:08.383 [2024-07-24 18:46:53.233374] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:08.383 [2024-07-24 18:46:53.233385] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b7050 00:10:08.383 [2024-07-24 18:46:53.233391] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:08.383 [2024-07-24 18:46:53.233657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:08.383 [2024-07-24 18:46:53.233667] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:08.383 [2024-07-24 18:46:53.233713] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:08.383 [2024-07-24 18:46:53.233725] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:08.383 [2024-07-24 18:46:53.233791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23b60c0 00:10:08.383 [2024-07-24 18:46:53.233797] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:08.383 [2024-07-24 18:46:53.233908] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2565090 00:10:08.383 [2024-07-24 18:46:53.233990] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23b60c0 00:10:08.383 [2024-07-24 18:46:53.233995] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23b60c0 00:10:08.383 [2024-07-24 18:46:53.234060] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:08.383 pt2 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.383 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:08.641 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:08.641 "name": "raid_bdev1", 00:10:08.641 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:08.641 "strip_size_kb": 64, 00:10:08.641 "state": "online", 00:10:08.641 "raid_level": "concat", 00:10:08.641 "superblock": true, 00:10:08.641 "num_base_bdevs": 2, 00:10:08.641 "num_base_bdevs_discovered": 2, 00:10:08.641 "num_base_bdevs_operational": 2, 00:10:08.641 "base_bdevs_list": [ 00:10:08.641 { 00:10:08.641 "name": "pt1", 00:10:08.641 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:08.641 "is_configured": true, 00:10:08.641 "data_offset": 2048, 00:10:08.641 "data_size": 63488 00:10:08.641 }, 00:10:08.641 { 00:10:08.641 "name": "pt2", 00:10:08.641 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:08.641 "is_configured": true, 00:10:08.641 "data_offset": 2048, 00:10:08.641 "data_size": 63488 00:10:08.641 } 00:10:08.641 ] 00:10:08.641 }' 00:10:08.641 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:08.642 18:46:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:09.207 18:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:09.207 [2024-07-24 18:46:54.075703] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:09.207 "name": "raid_bdev1", 00:10:09.207 "aliases": [ 00:10:09.207 "ddbe01f2-a5a8-40b3-8925-71ec7a080976" 00:10:09.207 ], 00:10:09.207 "product_name": "Raid Volume", 00:10:09.207 "block_size": 512, 00:10:09.207 "num_blocks": 126976, 00:10:09.207 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:09.207 "assigned_rate_limits": { 00:10:09.207 "rw_ios_per_sec": 0, 00:10:09.207 "rw_mbytes_per_sec": 0, 00:10:09.207 "r_mbytes_per_sec": 0, 00:10:09.207 "w_mbytes_per_sec": 0 00:10:09.207 }, 00:10:09.207 "claimed": false, 00:10:09.207 "zoned": false, 00:10:09.207 "supported_io_types": { 00:10:09.207 "read": true, 00:10:09.207 "write": true, 00:10:09.207 "unmap": true, 00:10:09.207 "flush": true, 00:10:09.207 "reset": true, 00:10:09.207 "nvme_admin": false, 00:10:09.207 "nvme_io": false, 00:10:09.207 "nvme_io_md": false, 00:10:09.207 "write_zeroes": true, 00:10:09.207 "zcopy": false, 00:10:09.207 "get_zone_info": false, 00:10:09.207 "zone_management": false, 00:10:09.207 "zone_append": false, 00:10:09.207 "compare": false, 00:10:09.207 "compare_and_write": false, 00:10:09.207 "abort": false, 00:10:09.207 "seek_hole": false, 00:10:09.207 "seek_data": false, 00:10:09.207 "copy": false, 00:10:09.207 "nvme_iov_md": false 00:10:09.207 }, 00:10:09.207 "memory_domains": [ 00:10:09.207 { 00:10:09.207 "dma_device_id": 
"system", 00:10:09.207 "dma_device_type": 1 00:10:09.207 }, 00:10:09.207 { 00:10:09.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:09.207 "dma_device_type": 2 00:10:09.207 }, 00:10:09.207 { 00:10:09.207 "dma_device_id": "system", 00:10:09.207 "dma_device_type": 1 00:10:09.207 }, 00:10:09.207 { 00:10:09.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:09.207 "dma_device_type": 2 00:10:09.207 } 00:10:09.207 ], 00:10:09.207 "driver_specific": { 00:10:09.207 "raid": { 00:10:09.207 "uuid": "ddbe01f2-a5a8-40b3-8925-71ec7a080976", 00:10:09.207 "strip_size_kb": 64, 00:10:09.207 "state": "online", 00:10:09.207 "raid_level": "concat", 00:10:09.207 "superblock": true, 00:10:09.207 "num_base_bdevs": 2, 00:10:09.207 "num_base_bdevs_discovered": 2, 00:10:09.207 "num_base_bdevs_operational": 2, 00:10:09.207 "base_bdevs_list": [ 00:10:09.207 { 00:10:09.207 "name": "pt1", 00:10:09.207 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:09.207 "is_configured": true, 00:10:09.207 "data_offset": 2048, 00:10:09.207 "data_size": 63488 00:10:09.207 }, 00:10:09.207 { 00:10:09.207 "name": "pt2", 00:10:09.207 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:09.207 "is_configured": true, 00:10:09.207 "data_offset": 2048, 00:10:09.207 "data_size": 63488 00:10:09.207 } 00:10:09.207 ] 00:10:09.207 } 00:10:09.207 } 00:10:09.207 }' 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:09.207 pt2' 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:09.207 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:09.465 "name": "pt1", 00:10:09.465 "aliases": [ 00:10:09.465 "00000000-0000-0000-0000-000000000001" 00:10:09.465 ], 00:10:09.465 "product_name": "passthru", 00:10:09.465 "block_size": 512, 00:10:09.465 "num_blocks": 65536, 00:10:09.465 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:09.465 "assigned_rate_limits": { 00:10:09.465 "rw_ios_per_sec": 0, 00:10:09.465 "rw_mbytes_per_sec": 0, 00:10:09.465 "r_mbytes_per_sec": 0, 00:10:09.465 "w_mbytes_per_sec": 0 00:10:09.465 }, 00:10:09.465 "claimed": true, 00:10:09.465 "claim_type": "exclusive_write", 00:10:09.465 "zoned": false, 00:10:09.465 "supported_io_types": { 00:10:09.465 "read": true, 00:10:09.465 "write": true, 00:10:09.465 "unmap": true, 00:10:09.465 "flush": true, 00:10:09.465 "reset": true, 00:10:09.465 "nvme_admin": false, 00:10:09.465 "nvme_io": false, 00:10:09.465 "nvme_io_md": false, 00:10:09.465 "write_zeroes": true, 00:10:09.465 "zcopy": true, 00:10:09.465 "get_zone_info": false, 00:10:09.465 "zone_management": false, 00:10:09.465 "zone_append": false, 00:10:09.465 "compare": false, 00:10:09.465 "compare_and_write": false, 00:10:09.465 "abort": true, 00:10:09.465 "seek_hole": false, 00:10:09.465 "seek_data": false, 00:10:09.465 "copy": true, 00:10:09.465 "nvme_iov_md": false 00:10:09.465 }, 00:10:09.465 "memory_domains": [ 00:10:09.465 { 00:10:09.465 "dma_device_id": "system", 00:10:09.465 "dma_device_type": 1 00:10:09.465 }, 
00:10:09.465 { 00:10:09.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:09.465 "dma_device_type": 2 00:10:09.465 } 00:10:09.465 ], 00:10:09.465 "driver_specific": { 00:10:09.465 "passthru": { 00:10:09.465 "name": "pt1", 00:10:09.465 "base_bdev_name": "malloc1" 00:10:09.465 } 00:10:09.465 } 00:10:09.465 }' 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:09.465 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:09.723 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:09.980 "name": "pt2", 00:10:09.980 "aliases": [ 00:10:09.980 "00000000-0000-0000-0000-000000000002" 00:10:09.980 ], 00:10:09.980 "product_name": "passthru", 00:10:09.980 "block_size": 512, 00:10:09.980 "num_blocks": 65536, 00:10:09.980 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:09.980 "assigned_rate_limits": { 00:10:09.980 "rw_ios_per_sec": 0, 00:10:09.980 "rw_mbytes_per_sec": 0, 00:10:09.980 "r_mbytes_per_sec": 0, 00:10:09.980 "w_mbytes_per_sec": 0 00:10:09.980 }, 00:10:09.980 "claimed": true, 00:10:09.980 "claim_type": "exclusive_write", 00:10:09.980 "zoned": false, 00:10:09.980 "supported_io_types": { 00:10:09.980 "read": true, 00:10:09.980 "write": true, 00:10:09.980 "unmap": true, 00:10:09.980 "flush": true, 00:10:09.980 "reset": true, 00:10:09.980 "nvme_admin": false, 00:10:09.980 "nvme_io": false, 00:10:09.980 "nvme_io_md": false, 00:10:09.980 "write_zeroes": true, 00:10:09.980 "zcopy": true, 00:10:09.980 "get_zone_info": false, 00:10:09.980 "zone_management": false, 00:10:09.980 "zone_append": false, 00:10:09.980 "compare": false, 00:10:09.980 "compare_and_write": false, 00:10:09.980 "abort": true, 00:10:09.980 "seek_hole": false, 00:10:09.980 "seek_data": false, 00:10:09.980 "copy": true, 00:10:09.980 "nvme_iov_md": false 00:10:09.980 }, 00:10:09.980 "memory_domains": [ 00:10:09.980 { 00:10:09.980 "dma_device_id": "system", 00:10:09.980 "dma_device_type": 1 00:10:09.980 }, 00:10:09.980 { 00:10:09.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:09.980 "dma_device_type": 2 00:10:09.980 } 00:10:09.980 ], 
00:10:09.980 "driver_specific": { 00:10:09.980 "passthru": { 00:10:09.980 "name": "pt2", 00:10:09.980 "base_bdev_name": "malloc2" 00:10:09.980 } 00:10:09.980 } 00:10:09.980 }' 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:09.980 18:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:10.238 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:10.496 [2024-07-24 18:46:55.258809] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ddbe01f2-a5a8-40b3-8925-71ec7a080976 '!=' ddbe01f2-a5a8-40b3-8925-71ec7a080976 ']' 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2056105 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2056105 ']' 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2056105 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2056105 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2056105' 00:10:10.496 killing process with pid 2056105 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2056105 00:10:10.496 [2024-07-24 18:46:55.318024] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:10.496 [2024-07-24 
18:46:55.318062] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:10.496 [2024-07-24 18:46:55.318093] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:10.496 [2024-07-24 18:46:55.318099] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b60c0 name raid_bdev1, state offline 00:10:10.496 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2056105 00:10:10.496 [2024-07-24 18:46:55.333102] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:10.754 18:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:10.754 00:10:10.754 real 0m8.126s 00:10:10.754 user 0m14.619s 00:10:10.754 sys 0m1.310s 00:10:10.754 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:10.754 18:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:10.754 ************************************ 00:10:10.754 END TEST raid_superblock_test 00:10:10.754 ************************************ 00:10:10.754 18:46:55 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:10:10.754 18:46:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:10.754 18:46:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:10.754 18:46:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:10.754 ************************************ 00:10:10.754 START TEST raid_read_error_test 00:10:10.754 ************************************ 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:10.754 18:46:55 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.DrqkMLZ3rS 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2057712 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2057712 /var/tmp/spdk-raid.sock 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2057712 ']' 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:10.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:10.754 18:46:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:10.754 [2024-07-24 18:46:55.631009] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
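Unlike the previous tests, raid_read_error_test drives actual I/O, so it launches bdevperf instead of bdev_svc: the -z flag makes bdevperf start idle and wait for RPC configuration, and the random read/write workload (50% mix, 128k I/O size, queue depth 1) is only triggered later in the test over RPC. A condensed sketch of that launch follows, with the flags copied from the trace and only the surrounding bookkeeping (log redirection, variable names) invented for illustration.

# Sketch of the bdevperf launch visible above; the log file comes from
# mktemp -p /raidtest as in the trace, and the later step that actually
# starts the workload is outside this excerpt.
bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
sock=/var/tmp/spdk-raid.sock
bdevperf_log=$(mktemp -p /raidtest)

"$bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
raid_pid=$!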
00:10:10.754 [2024-07-24 18:46:55.631047] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057712 ] 00:10:10.754 [2024-07-24 18:46:55.695332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:11.012 [2024-07-24 18:46:55.774080] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:11.012 [2024-07-24 18:46:55.822603] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:11.012 [2024-07-24 18:46:55.822627] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:11.577 18:46:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:11.577 18:46:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:11.577 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:11.577 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:11.835 BaseBdev1_malloc 00:10:11.835 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:11.835 true 00:10:11.835 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:12.093 [2024-07-24 18:46:56.926558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:12.093 [2024-07-24 18:46:56.926591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:12.093 [2024-07-24 18:46:56.926603] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168ad20 00:10:12.093 [2024-07-24 18:46:56.926610] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:12.093 [2024-07-24 18:46:56.927800] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:12.093 [2024-07-24 18:46:56.927820] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:12.093 BaseBdev1 00:10:12.093 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:12.093 18:46:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:12.093 BaseBdev2_malloc 00:10:12.351 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:12.351 true 00:10:12.351 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:12.610 [2024-07-24 18:46:57.423513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:12.610 [2024-07-24 18:46:57.423544] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:12.610 [2024-07-24 18:46:57.423555] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168fd50 00:10:12.610 [2024-07-24 18:46:57.423561] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:12.610 [2024-07-24 18:46:57.424646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:12.610 [2024-07-24 18:46:57.424666] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:12.610 BaseBdev2 00:10:12.610 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:12.610 [2024-07-24 18:46:57.595975] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:12.610 [2024-07-24 18:46:57.596769] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:12.610 [2024-07-24 18:46:57.596897] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16910e0 00:10:12.610 [2024-07-24 18:46:57.596905] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:12.610 [2024-07-24 18:46:57.597027] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16997d0 00:10:12.610 [2024-07-24 18:46:57.597124] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16910e0 00:10:12.610 [2024-07-24 18:46:57.597133] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16910e0 00:10:12.610 [2024-07-24 18:46:57.597201] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:12.867 "name": "raid_bdev1", 00:10:12.867 "uuid": "c2ec402d-2240-493c-b231-633f2fa91d51", 00:10:12.867 "strip_size_kb": 64, 00:10:12.867 "state": "online", 00:10:12.867 "raid_level": 
"concat", 00:10:12.867 "superblock": true, 00:10:12.867 "num_base_bdevs": 2, 00:10:12.867 "num_base_bdevs_discovered": 2, 00:10:12.867 "num_base_bdevs_operational": 2, 00:10:12.867 "base_bdevs_list": [ 00:10:12.867 { 00:10:12.867 "name": "BaseBdev1", 00:10:12.867 "uuid": "3ffa3b77-d9f7-52b1-807d-e1f5f9dec09a", 00:10:12.867 "is_configured": true, 00:10:12.867 "data_offset": 2048, 00:10:12.867 "data_size": 63488 00:10:12.867 }, 00:10:12.867 { 00:10:12.867 "name": "BaseBdev2", 00:10:12.867 "uuid": "9080a753-feb2-5ae7-a38a-3883f8fcc653", 00:10:12.867 "is_configured": true, 00:10:12.867 "data_offset": 2048, 00:10:12.867 "data_size": 63488 00:10:12.867 } 00:10:12.867 ] 00:10:12.867 }' 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:12.867 18:46:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:13.432 18:46:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:13.433 18:46:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:13.433 [2024-07-24 18:46:58.374190] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168cac0 00:10:14.369 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:14.627 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:14.627 "name": "raid_bdev1", 00:10:14.627 "uuid": "c2ec402d-2240-493c-b231-633f2fa91d51", 00:10:14.627 "strip_size_kb": 64, 00:10:14.627 "state": "online", 
00:10:14.627 "raid_level": "concat", 00:10:14.627 "superblock": true, 00:10:14.627 "num_base_bdevs": 2, 00:10:14.627 "num_base_bdevs_discovered": 2, 00:10:14.627 "num_base_bdevs_operational": 2, 00:10:14.628 "base_bdevs_list": [ 00:10:14.628 { 00:10:14.628 "name": "BaseBdev1", 00:10:14.628 "uuid": "3ffa3b77-d9f7-52b1-807d-e1f5f9dec09a", 00:10:14.628 "is_configured": true, 00:10:14.628 "data_offset": 2048, 00:10:14.628 "data_size": 63488 00:10:14.628 }, 00:10:14.628 { 00:10:14.628 "name": "BaseBdev2", 00:10:14.628 "uuid": "9080a753-feb2-5ae7-a38a-3883f8fcc653", 00:10:14.628 "is_configured": true, 00:10:14.628 "data_offset": 2048, 00:10:14.628 "data_size": 63488 00:10:14.628 } 00:10:14.628 ] 00:10:14.628 }' 00:10:14.628 18:46:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:14.628 18:46:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:15.194 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:15.452 [2024-07-24 18:47:00.245713] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:15.452 [2024-07-24 18:47:00.245745] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:15.452 [2024-07-24 18:47:00.247789] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:15.452 [2024-07-24 18:47:00.247810] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:15.452 [2024-07-24 18:47:00.247826] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:15.452 [2024-07-24 18:47:00.247832] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16910e0 name raid_bdev1, state offline 00:10:15.452 0 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2057712 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2057712 ']' 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2057712 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2057712 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2057712' 00:10:15.452 killing process with pid 2057712 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2057712 00:10:15.452 [2024-07-24 18:47:00.297089] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:15.452 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2057712 00:10:15.452 [2024-07-24 18:47:00.306747] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.DrqkMLZ3rS 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.54 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.54 != \0\.\0\0 ]] 00:10:15.710 00:10:15.710 real 0m4.924s 00:10:15.710 user 0m7.521s 00:10:15.710 sys 0m0.718s 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:15.710 18:47:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:15.710 ************************************ 00:10:15.710 END TEST raid_read_error_test 00:10:15.710 ************************************ 00:10:15.710 18:47:00 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:15.710 18:47:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:15.710 18:47:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:15.710 18:47:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:15.710 ************************************ 00:10:15.710 START TEST raid_write_error_test 00:10:15.710 ************************************ 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:15.710 18:47:00 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:15.710 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mT5M1NYSfZ 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2058687 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2058687 /var/tmp/spdk-raid.sock 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2058687 ']' 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:15.711 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:15.711 18:47:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:15.711 [2024-07-24 18:47:00.615895] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
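Before any I/O runs, both error-test passes assemble each RAID member as a malloc disk wrapped in an error bdev and a passthru bdev, build a concat array over the passthru devices, kick off bdevperf's run, and then inject a failure into the first member's error bdev (read failure in the previous pass, write failure in this one). A condensed sketch of that RPC sequence using the names and arguments from the trace (running perform_tests in the background and the $rpc shorthand are assumptions):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for bdev in BaseBdev1 BaseBdev2; do
        $rpc bdev_malloc_create 32 512 -b ${bdev}_malloc        # 32 MB disk, 512-byte blocks
        $rpc bdev_error_create ${bdev}_malloc                   # exposed as EE_${bdev}_malloc
        $rpc bdev_passthru_create -b EE_${bdev}_malloc -p ${bdev}
    done
    $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s   # 64k strip, with superblock
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests &              # assumed to run in the background
    sleep 1
    $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure   # 'read failure' in the read pass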
00:10:15.711 [2024-07-24 18:47:00.615933] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058687 ] 00:10:15.711 [2024-07-24 18:47:00.679945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:15.969 [2024-07-24 18:47:00.759856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:15.969 [2024-07-24 18:47:00.810375] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:15.969 [2024-07-24 18:47:00.810403] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:16.535 18:47:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:16.535 18:47:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:16.535 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:16.535 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:16.793 BaseBdev1_malloc 00:10:16.793 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:16.793 true 00:10:16.793 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:17.052 [2024-07-24 18:47:01.898776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:17.052 [2024-07-24 18:47:01.898808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:17.052 [2024-07-24 18:47:01.898820] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2784d20 00:10:17.052 [2024-07-24 18:47:01.898825] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:17.052 [2024-07-24 18:47:01.900021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:17.052 [2024-07-24 18:47:01.900043] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:17.052 BaseBdev1 00:10:17.052 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:17.052 18:47:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:17.311 BaseBdev2_malloc 00:10:17.311 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:17.311 true 00:10:17.311 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:17.569 [2024-07-24 18:47:02.379657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:17.569 [2024-07-24 18:47:02.379686] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:17.569 [2024-07-24 18:47:02.379697] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2789d50 00:10:17.569 [2024-07-24 18:47:02.379702] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:17.569 [2024-07-24 18:47:02.380714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:17.569 [2024-07-24 18:47:02.380734] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:17.569 BaseBdev2 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:17.569 [2024-07-24 18:47:02.532076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:17.569 [2024-07-24 18:47:02.532922] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:17.569 [2024-07-24 18:47:02.533052] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x278b0e0 00:10:17.569 [2024-07-24 18:47:02.533061] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:17.569 [2024-07-24 18:47:02.533184] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27937d0 00:10:17.569 [2024-07-24 18:47:02.533283] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x278b0e0 00:10:17.569 [2024-07-24 18:47:02.533288] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x278b0e0 00:10:17.569 [2024-07-24 18:47:02.533358] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:17.569 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.827 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.827 "name": "raid_bdev1", 00:10:17.827 "uuid": "adb81524-e3b8-42cd-8bfb-a04647477692", 00:10:17.827 "strip_size_kb": 64, 00:10:17.827 "state": "online", 00:10:17.827 
"raid_level": "concat", 00:10:17.827 "superblock": true, 00:10:17.827 "num_base_bdevs": 2, 00:10:17.827 "num_base_bdevs_discovered": 2, 00:10:17.827 "num_base_bdevs_operational": 2, 00:10:17.827 "base_bdevs_list": [ 00:10:17.827 { 00:10:17.827 "name": "BaseBdev1", 00:10:17.827 "uuid": "f250df4a-f816-5d97-aec6-a58b395b10a9", 00:10:17.827 "is_configured": true, 00:10:17.827 "data_offset": 2048, 00:10:17.827 "data_size": 63488 00:10:17.827 }, 00:10:17.827 { 00:10:17.827 "name": "BaseBdev2", 00:10:17.827 "uuid": "a42bd017-8224-5c99-b9f7-3bf0e151cd57", 00:10:17.827 "is_configured": true, 00:10:17.827 "data_offset": 2048, 00:10:17.827 "data_size": 63488 00:10:17.827 } 00:10:17.827 ] 00:10:17.827 }' 00:10:17.827 18:47:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.827 18:47:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.392 18:47:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:18.392 18:47:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:18.393 [2024-07-24 18:47:03.286259] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2786ac0 00:10:19.326 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:19.584 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:19.584 "name": "raid_bdev1", 00:10:19.584 "uuid": "adb81524-e3b8-42cd-8bfb-a04647477692", 00:10:19.584 "strip_size_kb": 
64, 00:10:19.584 "state": "online", 00:10:19.584 "raid_level": "concat", 00:10:19.584 "superblock": true, 00:10:19.584 "num_base_bdevs": 2, 00:10:19.584 "num_base_bdevs_discovered": 2, 00:10:19.584 "num_base_bdevs_operational": 2, 00:10:19.585 "base_bdevs_list": [ 00:10:19.585 { 00:10:19.585 "name": "BaseBdev1", 00:10:19.585 "uuid": "f250df4a-f816-5d97-aec6-a58b395b10a9", 00:10:19.585 "is_configured": true, 00:10:19.585 "data_offset": 2048, 00:10:19.585 "data_size": 63488 00:10:19.585 }, 00:10:19.585 { 00:10:19.585 "name": "BaseBdev2", 00:10:19.585 "uuid": "a42bd017-8224-5c99-b9f7-3bf0e151cd57", 00:10:19.585 "is_configured": true, 00:10:19.585 "data_offset": 2048, 00:10:19.585 "data_size": 63488 00:10:19.585 } 00:10:19.585 ] 00:10:19.585 }' 00:10:19.585 18:47:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:19.585 18:47:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.149 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:20.407 [2024-07-24 18:47:05.194366] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:20.407 [2024-07-24 18:47:05.194398] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:20.407 [2024-07-24 18:47:05.196493] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:20.407 [2024-07-24 18:47:05.196514] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.407 [2024-07-24 18:47:05.196530] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:20.407 [2024-07-24 18:47:05.196535] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278b0e0 name raid_bdev1, state offline 00:10:20.407 0 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2058687 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2058687 ']' 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2058687 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2058687 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2058687' 00:10:20.407 killing process with pid 2058687 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2058687 00:10:20.407 [2024-07-24 18:47:05.255667] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:20.407 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2058687 00:10:20.407 [2024-07-24 18:47:05.264868] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.mT5M1NYSfZ 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:10:20.666 00:10:20.666 real 0m4.896s 00:10:20.666 user 0m7.499s 00:10:20.666 sys 0m0.683s 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:20.666 18:47:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.666 ************************************ 00:10:20.666 END TEST raid_write_error_test 00:10:20.666 ************************************ 00:10:20.666 18:47:05 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:20.666 18:47:05 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:10:20.666 18:47:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:20.666 18:47:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:20.666 18:47:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:20.666 ************************************ 00:10:20.666 START TEST raid_state_function_test 00:10:20.666 ************************************ 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 
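Each error pass finishes by pulling the raid_bdev1 failure rate out of the bdevperf log and comparing it against what the RAID level can tolerate: concat has no redundancy (has_redundancy returns 1 in the trace), so the injected error is expected to surface, 0.54 failures/s in the read pass and 0.53 failures/s in the write pass above. A sketch of that check; the redundant branch is an inference from the non-redundant one actually shown:

    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    if has_redundancy "$raid_level"; then
        [[ "$fail_per_s" == "0.00" ]]    # assumed: redundant levels (raid1) must mask the injected error
    else
        [[ "$fail_per_s" != "0.00" ]]    # concat must report the failures (0.53/s here)
    fi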
00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2059508 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2059508' 00:10:20.666 Process raid pid: 2059508 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2059508 /var/tmp/spdk-raid.sock 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2059508 ']' 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:20.666 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:20.666 18:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.666 [2024-07-24 18:47:05.582246] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:10:20.666 [2024-07-24 18:47:05.582284] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:20.666 [2024-07-24 18:47:05.646993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:20.925 [2024-07-24 18:47:05.725881] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:20.925 [2024-07-24 18:47:05.782949] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:20.925 [2024-07-24 18:47:05.782971] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:21.491 18:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:21.491 18:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:21.491 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:21.881 [2024-07-24 18:47:06.526262] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:21.881 [2024-07-24 18:47:06.526290] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:21.881 [2024-07-24 18:47:06.526296] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:21.881 [2024-07-24 18:47:06.526301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:21.881 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:21.881 "name": "Existed_Raid", 00:10:21.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:21.881 "strip_size_kb": 0, 00:10:21.881 "state": "configuring", 00:10:21.881 "raid_level": "raid1", 00:10:21.881 "superblock": false, 00:10:21.881 
"num_base_bdevs": 2, 00:10:21.882 "num_base_bdevs_discovered": 0, 00:10:21.882 "num_base_bdevs_operational": 2, 00:10:21.882 "base_bdevs_list": [ 00:10:21.882 { 00:10:21.882 "name": "BaseBdev1", 00:10:21.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:21.882 "is_configured": false, 00:10:21.882 "data_offset": 0, 00:10:21.882 "data_size": 0 00:10:21.882 }, 00:10:21.882 { 00:10:21.882 "name": "BaseBdev2", 00:10:21.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:21.882 "is_configured": false, 00:10:21.882 "data_offset": 0, 00:10:21.882 "data_size": 0 00:10:21.882 } 00:10:21.882 ] 00:10:21.882 }' 00:10:21.882 18:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:21.882 18:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:22.447 18:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:22.447 [2024-07-24 18:47:07.376393] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:22.447 [2024-07-24 18:47:07.376411] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f88b80 name Existed_Raid, state configuring 00:10:22.447 18:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:22.704 [2024-07-24 18:47:07.556867] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:22.704 [2024-07-24 18:47:07.556884] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:22.704 [2024-07-24 18:47:07.556888] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:22.704 [2024-07-24 18:47:07.556893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:22.704 18:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:22.963 [2024-07-24 18:47:07.733390] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:22.963 BaseBdev1 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:22.963 18:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:23.221 [ 
00:10:23.221 { 00:10:23.221 "name": "BaseBdev1", 00:10:23.221 "aliases": [ 00:10:23.221 "2b5b5502-57d4-45f0-b826-232bbb6859c2" 00:10:23.221 ], 00:10:23.221 "product_name": "Malloc disk", 00:10:23.221 "block_size": 512, 00:10:23.221 "num_blocks": 65536, 00:10:23.221 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:23.221 "assigned_rate_limits": { 00:10:23.221 "rw_ios_per_sec": 0, 00:10:23.221 "rw_mbytes_per_sec": 0, 00:10:23.221 "r_mbytes_per_sec": 0, 00:10:23.221 "w_mbytes_per_sec": 0 00:10:23.221 }, 00:10:23.221 "claimed": true, 00:10:23.221 "claim_type": "exclusive_write", 00:10:23.221 "zoned": false, 00:10:23.221 "supported_io_types": { 00:10:23.221 "read": true, 00:10:23.221 "write": true, 00:10:23.221 "unmap": true, 00:10:23.221 "flush": true, 00:10:23.221 "reset": true, 00:10:23.221 "nvme_admin": false, 00:10:23.221 "nvme_io": false, 00:10:23.221 "nvme_io_md": false, 00:10:23.221 "write_zeroes": true, 00:10:23.221 "zcopy": true, 00:10:23.221 "get_zone_info": false, 00:10:23.221 "zone_management": false, 00:10:23.221 "zone_append": false, 00:10:23.221 "compare": false, 00:10:23.221 "compare_and_write": false, 00:10:23.221 "abort": true, 00:10:23.221 "seek_hole": false, 00:10:23.221 "seek_data": false, 00:10:23.221 "copy": true, 00:10:23.221 "nvme_iov_md": false 00:10:23.221 }, 00:10:23.221 "memory_domains": [ 00:10:23.221 { 00:10:23.221 "dma_device_id": "system", 00:10:23.221 "dma_device_type": 1 00:10:23.221 }, 00:10:23.221 { 00:10:23.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:23.221 "dma_device_type": 2 00:10:23.221 } 00:10:23.221 ], 00:10:23.221 "driver_specific": {} 00:10:23.221 } 00:10:23.221 ] 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.221 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:23.479 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:23.480 "name": "Existed_Raid", 00:10:23.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:23.480 "strip_size_kb": 0, 00:10:23.480 "state": "configuring", 00:10:23.480 "raid_level": "raid1", 
00:10:23.480 "superblock": false, 00:10:23.480 "num_base_bdevs": 2, 00:10:23.480 "num_base_bdevs_discovered": 1, 00:10:23.480 "num_base_bdevs_operational": 2, 00:10:23.480 "base_bdevs_list": [ 00:10:23.480 { 00:10:23.480 "name": "BaseBdev1", 00:10:23.480 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:23.480 "is_configured": true, 00:10:23.480 "data_offset": 0, 00:10:23.480 "data_size": 65536 00:10:23.480 }, 00:10:23.480 { 00:10:23.480 "name": "BaseBdev2", 00:10:23.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:23.480 "is_configured": false, 00:10:23.480 "data_offset": 0, 00:10:23.480 "data_size": 0 00:10:23.480 } 00:10:23.480 ] 00:10:23.480 }' 00:10:23.480 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:23.480 18:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.047 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:24.047 [2024-07-24 18:47:08.900408] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:24.047 [2024-07-24 18:47:08.900437] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f88470 name Existed_Raid, state configuring 00:10:24.047 18:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:24.306 [2024-07-24 18:47:09.068859] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:24.306 [2024-07-24 18:47:09.069912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:24.306 [2024-07-24 18:47:09.069936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:24.306 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:24.306 "name": "Existed_Raid", 00:10:24.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:24.306 "strip_size_kb": 0, 00:10:24.306 "state": "configuring", 00:10:24.306 "raid_level": "raid1", 00:10:24.306 "superblock": false, 00:10:24.306 "num_base_bdevs": 2, 00:10:24.306 "num_base_bdevs_discovered": 1, 00:10:24.306 "num_base_bdevs_operational": 2, 00:10:24.306 "base_bdevs_list": [ 00:10:24.306 { 00:10:24.306 "name": "BaseBdev1", 00:10:24.306 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:24.306 "is_configured": true, 00:10:24.306 "data_offset": 0, 00:10:24.306 "data_size": 65536 00:10:24.306 }, 00:10:24.306 { 00:10:24.306 "name": "BaseBdev2", 00:10:24.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:24.306 "is_configured": false, 00:10:24.306 "data_offset": 0, 00:10:24.306 "data_size": 0 00:10:24.307 } 00:10:24.307 ] 00:10:24.307 }' 00:10:24.307 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:24.307 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.874 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:24.874 [2024-07-24 18:47:09.869627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:24.874 [2024-07-24 18:47:09.869654] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f89260 00:10:24.874 [2024-07-24 18:47:09.869658] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:10:24.874 [2024-07-24 18:47:09.869782] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21324f0 00:10:24.874 [2024-07-24 18:47:09.869863] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f89260 00:10:24.874 [2024-07-24 18:47:09.869869] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f89260 00:10:24.874 [2024-07-24 18:47:09.869981] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:24.874 BaseBdev2 00:10:24.874 18:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:24.874 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:25.133 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:25.133 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:25.133 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:25.133 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:25.133 18:47:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:25.133 18:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:25.392 [ 
00:10:25.392 { 00:10:25.392 "name": "BaseBdev2", 00:10:25.392 "aliases": [ 00:10:25.392 "01d56e26-3d15-4d44-93e6-c71c71d89ea2" 00:10:25.392 ], 00:10:25.392 "product_name": "Malloc disk", 00:10:25.392 "block_size": 512, 00:10:25.392 "num_blocks": 65536, 00:10:25.392 "uuid": "01d56e26-3d15-4d44-93e6-c71c71d89ea2", 00:10:25.392 "assigned_rate_limits": { 00:10:25.392 "rw_ios_per_sec": 0, 00:10:25.392 "rw_mbytes_per_sec": 0, 00:10:25.392 "r_mbytes_per_sec": 0, 00:10:25.392 "w_mbytes_per_sec": 0 00:10:25.392 }, 00:10:25.392 "claimed": true, 00:10:25.392 "claim_type": "exclusive_write", 00:10:25.392 "zoned": false, 00:10:25.392 "supported_io_types": { 00:10:25.392 "read": true, 00:10:25.392 "write": true, 00:10:25.392 "unmap": true, 00:10:25.392 "flush": true, 00:10:25.392 "reset": true, 00:10:25.392 "nvme_admin": false, 00:10:25.392 "nvme_io": false, 00:10:25.392 "nvme_io_md": false, 00:10:25.392 "write_zeroes": true, 00:10:25.392 "zcopy": true, 00:10:25.392 "get_zone_info": false, 00:10:25.392 "zone_management": false, 00:10:25.392 "zone_append": false, 00:10:25.392 "compare": false, 00:10:25.392 "compare_and_write": false, 00:10:25.392 "abort": true, 00:10:25.392 "seek_hole": false, 00:10:25.392 "seek_data": false, 00:10:25.392 "copy": true, 00:10:25.392 "nvme_iov_md": false 00:10:25.392 }, 00:10:25.392 "memory_domains": [ 00:10:25.392 { 00:10:25.392 "dma_device_id": "system", 00:10:25.392 "dma_device_type": 1 00:10:25.392 }, 00:10:25.392 { 00:10:25.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.392 "dma_device_type": 2 00:10:25.392 } 00:10:25.392 ], 00:10:25.392 "driver_specific": {} 00:10:25.392 } 00:10:25.392 ] 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:25.392 "name": "Existed_Raid", 00:10:25.392 "uuid": "bf999866-32e1-488a-9386-4a4af3c77dd1", 00:10:25.392 "strip_size_kb": 0, 00:10:25.392 "state": "online", 00:10:25.392 "raid_level": "raid1", 00:10:25.392 "superblock": false, 00:10:25.392 "num_base_bdevs": 2, 00:10:25.392 "num_base_bdevs_discovered": 2, 00:10:25.392 "num_base_bdevs_operational": 2, 00:10:25.392 "base_bdevs_list": [ 00:10:25.392 { 00:10:25.392 "name": "BaseBdev1", 00:10:25.392 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:25.392 "is_configured": true, 00:10:25.392 "data_offset": 0, 00:10:25.392 "data_size": 65536 00:10:25.392 }, 00:10:25.392 { 00:10:25.392 "name": "BaseBdev2", 00:10:25.392 "uuid": "01d56e26-3d15-4d44-93e6-c71c71d89ea2", 00:10:25.392 "is_configured": true, 00:10:25.392 "data_offset": 0, 00:10:25.392 "data_size": 65536 00:10:25.392 } 00:10:25.392 ] 00:10:25.392 }' 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:25.392 18:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:25.992 18:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:26.251 [2024-07-24 18:47:11.020766] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:26.251 "name": "Existed_Raid", 00:10:26.251 "aliases": [ 00:10:26.251 "bf999866-32e1-488a-9386-4a4af3c77dd1" 00:10:26.251 ], 00:10:26.251 "product_name": "Raid Volume", 00:10:26.251 "block_size": 512, 00:10:26.251 "num_blocks": 65536, 00:10:26.251 "uuid": "bf999866-32e1-488a-9386-4a4af3c77dd1", 00:10:26.251 "assigned_rate_limits": { 00:10:26.251 "rw_ios_per_sec": 0, 00:10:26.251 "rw_mbytes_per_sec": 0, 00:10:26.251 "r_mbytes_per_sec": 0, 00:10:26.251 "w_mbytes_per_sec": 0 00:10:26.251 }, 00:10:26.251 "claimed": false, 00:10:26.251 "zoned": false, 00:10:26.251 "supported_io_types": { 00:10:26.251 "read": true, 00:10:26.251 "write": true, 00:10:26.251 "unmap": false, 00:10:26.251 "flush": false, 00:10:26.251 "reset": true, 00:10:26.251 "nvme_admin": false, 00:10:26.251 "nvme_io": false, 00:10:26.251 "nvme_io_md": false, 00:10:26.251 "write_zeroes": true, 00:10:26.251 "zcopy": false, 00:10:26.251 "get_zone_info": false, 00:10:26.251 "zone_management": false, 00:10:26.251 "zone_append": false, 00:10:26.251 "compare": false, 00:10:26.251 "compare_and_write": false, 00:10:26.251 "abort": false, 00:10:26.251 "seek_hole": false, 00:10:26.251 "seek_data": false, 00:10:26.251 "copy": false, 00:10:26.251 "nvme_iov_md": false 00:10:26.251 }, 00:10:26.251 
"memory_domains": [ 00:10:26.251 { 00:10:26.251 "dma_device_id": "system", 00:10:26.251 "dma_device_type": 1 00:10:26.251 }, 00:10:26.251 { 00:10:26.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.251 "dma_device_type": 2 00:10:26.251 }, 00:10:26.251 { 00:10:26.251 "dma_device_id": "system", 00:10:26.251 "dma_device_type": 1 00:10:26.251 }, 00:10:26.251 { 00:10:26.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.251 "dma_device_type": 2 00:10:26.251 } 00:10:26.251 ], 00:10:26.251 "driver_specific": { 00:10:26.251 "raid": { 00:10:26.251 "uuid": "bf999866-32e1-488a-9386-4a4af3c77dd1", 00:10:26.251 "strip_size_kb": 0, 00:10:26.251 "state": "online", 00:10:26.251 "raid_level": "raid1", 00:10:26.251 "superblock": false, 00:10:26.251 "num_base_bdevs": 2, 00:10:26.251 "num_base_bdevs_discovered": 2, 00:10:26.251 "num_base_bdevs_operational": 2, 00:10:26.251 "base_bdevs_list": [ 00:10:26.251 { 00:10:26.251 "name": "BaseBdev1", 00:10:26.251 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:26.251 "is_configured": true, 00:10:26.251 "data_offset": 0, 00:10:26.251 "data_size": 65536 00:10:26.251 }, 00:10:26.251 { 00:10:26.251 "name": "BaseBdev2", 00:10:26.251 "uuid": "01d56e26-3d15-4d44-93e6-c71c71d89ea2", 00:10:26.251 "is_configured": true, 00:10:26.251 "data_offset": 0, 00:10:26.251 "data_size": 65536 00:10:26.251 } 00:10:26.251 ] 00:10:26.251 } 00:10:26.251 } 00:10:26.251 }' 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:26.251 BaseBdev2' 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:26.251 "name": "BaseBdev1", 00:10:26.251 "aliases": [ 00:10:26.251 "2b5b5502-57d4-45f0-b826-232bbb6859c2" 00:10:26.251 ], 00:10:26.251 "product_name": "Malloc disk", 00:10:26.251 "block_size": 512, 00:10:26.251 "num_blocks": 65536, 00:10:26.251 "uuid": "2b5b5502-57d4-45f0-b826-232bbb6859c2", 00:10:26.251 "assigned_rate_limits": { 00:10:26.251 "rw_ios_per_sec": 0, 00:10:26.251 "rw_mbytes_per_sec": 0, 00:10:26.251 "r_mbytes_per_sec": 0, 00:10:26.251 "w_mbytes_per_sec": 0 00:10:26.251 }, 00:10:26.251 "claimed": true, 00:10:26.251 "claim_type": "exclusive_write", 00:10:26.251 "zoned": false, 00:10:26.251 "supported_io_types": { 00:10:26.251 "read": true, 00:10:26.251 "write": true, 00:10:26.251 "unmap": true, 00:10:26.251 "flush": true, 00:10:26.251 "reset": true, 00:10:26.251 "nvme_admin": false, 00:10:26.251 "nvme_io": false, 00:10:26.251 "nvme_io_md": false, 00:10:26.251 "write_zeroes": true, 00:10:26.251 "zcopy": true, 00:10:26.251 "get_zone_info": false, 00:10:26.251 "zone_management": false, 00:10:26.251 "zone_append": false, 00:10:26.251 "compare": false, 00:10:26.251 "compare_and_write": false, 00:10:26.251 "abort": true, 00:10:26.251 "seek_hole": false, 00:10:26.251 "seek_data": false, 00:10:26.251 "copy": true, 00:10:26.251 "nvme_iov_md": false 00:10:26.251 }, 00:10:26.251 
"memory_domains": [ 00:10:26.251 { 00:10:26.251 "dma_device_id": "system", 00:10:26.251 "dma_device_type": 1 00:10:26.251 }, 00:10:26.251 { 00:10:26.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.251 "dma_device_type": 2 00:10:26.251 } 00:10:26.251 ], 00:10:26.251 "driver_specific": {} 00:10:26.251 }' 00:10:26.251 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:26.510 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:26.769 "name": "BaseBdev2", 00:10:26.769 "aliases": [ 00:10:26.769 "01d56e26-3d15-4d44-93e6-c71c71d89ea2" 00:10:26.769 ], 00:10:26.769 "product_name": "Malloc disk", 00:10:26.769 "block_size": 512, 00:10:26.769 "num_blocks": 65536, 00:10:26.769 "uuid": "01d56e26-3d15-4d44-93e6-c71c71d89ea2", 00:10:26.769 "assigned_rate_limits": { 00:10:26.769 "rw_ios_per_sec": 0, 00:10:26.769 "rw_mbytes_per_sec": 0, 00:10:26.769 "r_mbytes_per_sec": 0, 00:10:26.769 "w_mbytes_per_sec": 0 00:10:26.769 }, 00:10:26.769 "claimed": true, 00:10:26.769 "claim_type": "exclusive_write", 00:10:26.769 "zoned": false, 00:10:26.769 "supported_io_types": { 00:10:26.769 "read": true, 00:10:26.769 "write": true, 00:10:26.769 "unmap": true, 00:10:26.769 "flush": true, 00:10:26.769 "reset": true, 00:10:26.769 "nvme_admin": false, 00:10:26.769 "nvme_io": false, 00:10:26.769 "nvme_io_md": false, 00:10:26.769 "write_zeroes": true, 00:10:26.769 "zcopy": true, 00:10:26.769 "get_zone_info": false, 00:10:26.769 "zone_management": false, 00:10:26.769 "zone_append": false, 00:10:26.769 "compare": false, 00:10:26.769 "compare_and_write": false, 00:10:26.769 "abort": true, 00:10:26.769 "seek_hole": false, 00:10:26.769 "seek_data": false, 00:10:26.769 "copy": true, 00:10:26.769 "nvme_iov_md": false 00:10:26.769 }, 00:10:26.769 "memory_domains": [ 00:10:26.769 { 00:10:26.769 "dma_device_id": "system", 00:10:26.769 "dma_device_type": 1 00:10:26.769 }, 00:10:26.769 { 00:10:26.769 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:26.769 "dma_device_type": 2 00:10:26.769 } 00:10:26.769 ], 00:10:26.769 "driver_specific": {} 00:10:26.769 }' 00:10:26.769 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:27.028 18:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.028 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:27.287 [2024-07-24 18:47:12.199672] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:10:27.287 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:27.546 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:27.546 "name": "Existed_Raid", 00:10:27.546 "uuid": "bf999866-32e1-488a-9386-4a4af3c77dd1", 00:10:27.546 "strip_size_kb": 0, 00:10:27.546 "state": "online", 00:10:27.546 "raid_level": "raid1", 00:10:27.546 "superblock": false, 00:10:27.546 "num_base_bdevs": 2, 00:10:27.546 "num_base_bdevs_discovered": 1, 00:10:27.546 "num_base_bdevs_operational": 1, 00:10:27.546 "base_bdevs_list": [ 00:10:27.546 { 00:10:27.546 "name": null, 00:10:27.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:27.546 "is_configured": false, 00:10:27.546 "data_offset": 0, 00:10:27.546 "data_size": 65536 00:10:27.546 }, 00:10:27.546 { 00:10:27.546 "name": "BaseBdev2", 00:10:27.546 "uuid": "01d56e26-3d15-4d44-93e6-c71c71d89ea2", 00:10:27.546 "is_configured": true, 00:10:27.546 "data_offset": 0, 00:10:27.546 "data_size": 65536 00:10:27.546 } 00:10:27.546 ] 00:10:27.546 }' 00:10:27.546 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:27.546 18:47:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.114 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:28.114 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:28.114 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.114 18:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:28.114 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:28.114 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:28.114 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:28.373 [2024-07-24 18:47:13.183041] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:28.373 [2024-07-24 18:47:13.183102] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:28.373 [2024-07-24 18:47:13.192943] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:28.373 [2024-07-24 18:47:13.192985] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:28.373 [2024-07-24 18:47:13.192990] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f89260 name Existed_Raid, state offline 00:10:28.373 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:28.373 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:28.373 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.373 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2059508 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2059508 ']' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2059508 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2059508 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2059508' 00:10:28.631 killing process with pid 2059508 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2059508 00:10:28.631 [2024-07-24 18:47:13.434052] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2059508 00:10:28.631 [2024-07-24 18:47:13.434839] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:28.631 00:10:28.631 real 0m8.080s 00:10:28.631 user 0m14.478s 00:10:28.631 sys 0m1.329s 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:28.631 18:47:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.631 ************************************ 00:10:28.631 END TEST raid_state_function_test 00:10:28.631 ************************************ 00:10:28.889 18:47:13 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:10:28.889 18:47:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:28.889 18:47:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:28.889 18:47:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:28.889 ************************************ 00:10:28.889 START TEST raid_state_function_test_sb 00:10:28.889 ************************************ 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:28.889 18:47:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:28.889 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2061108 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2061108' 00:10:28.890 Process raid pid: 2061108 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2061108 /var/tmp/spdk-raid.sock 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2061108 ']' 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:28.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:28.890 18:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:28.890 [2024-07-24 18:47:13.733587] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:10:28.890 [2024-07-24 18:47:13.733626] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:28.890 [2024-07-24 18:47:13.796809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:28.890 [2024-07-24 18:47:13.874428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.148 [2024-07-24 18:47:13.928935] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.148 [2024-07-24 18:47:13.928960] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.715 18:47:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:29.715 18:47:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:29.715 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:29.715 [2024-07-24 18:47:14.687506] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:29.715 [2024-07-24 18:47:14.687534] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:29.716 [2024-07-24 18:47:14.687540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:29.716 [2024-07-24 18:47:14.687546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.716 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:29.974 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:29.974 "name": "Existed_Raid", 00:10:29.974 "uuid": "2e4fcfa9-37f3-41ed-a464-3ef37a92850b", 00:10:29.974 "strip_size_kb": 0, 00:10:29.974 "state": "configuring", 00:10:29.974 "raid_level": "raid1", 
00:10:29.974 "superblock": true, 00:10:29.974 "num_base_bdevs": 2, 00:10:29.974 "num_base_bdevs_discovered": 0, 00:10:29.974 "num_base_bdevs_operational": 2, 00:10:29.974 "base_bdevs_list": [ 00:10:29.974 { 00:10:29.974 "name": "BaseBdev1", 00:10:29.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:29.974 "is_configured": false, 00:10:29.974 "data_offset": 0, 00:10:29.974 "data_size": 0 00:10:29.974 }, 00:10:29.974 { 00:10:29.974 "name": "BaseBdev2", 00:10:29.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:29.974 "is_configured": false, 00:10:29.974 "data_offset": 0, 00:10:29.974 "data_size": 0 00:10:29.974 } 00:10:29.974 ] 00:10:29.974 }' 00:10:29.974 18:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.974 18:47:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:30.541 18:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:30.541 [2024-07-24 18:47:15.485517] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:30.541 [2024-07-24 18:47:15.485538] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9d3b80 name Existed_Raid, state configuring 00:10:30.541 18:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.798 [2024-07-24 18:47:15.649952] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.798 [2024-07-24 18:47:15.649971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.798 [2024-07-24 18:47:15.649976] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.798 [2024-07-24 18:47:15.649982] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.798 18:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.056 [2024-07-24 18:47:15.834416] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.056 BaseBdev1 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:31.056 18:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:31.056 18:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:31.314 [ 00:10:31.314 { 00:10:31.314 "name": "BaseBdev1", 00:10:31.314 "aliases": [ 00:10:31.314 "8d65389b-a95c-432f-bc25-fbb5a4e123fc" 00:10:31.314 ], 00:10:31.314 "product_name": "Malloc disk", 00:10:31.314 "block_size": 512, 00:10:31.314 "num_blocks": 65536, 00:10:31.314 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:31.314 "assigned_rate_limits": { 00:10:31.314 "rw_ios_per_sec": 0, 00:10:31.314 "rw_mbytes_per_sec": 0, 00:10:31.314 "r_mbytes_per_sec": 0, 00:10:31.314 "w_mbytes_per_sec": 0 00:10:31.314 }, 00:10:31.314 "claimed": true, 00:10:31.314 "claim_type": "exclusive_write", 00:10:31.314 "zoned": false, 00:10:31.314 "supported_io_types": { 00:10:31.314 "read": true, 00:10:31.314 "write": true, 00:10:31.314 "unmap": true, 00:10:31.314 "flush": true, 00:10:31.314 "reset": true, 00:10:31.314 "nvme_admin": false, 00:10:31.314 "nvme_io": false, 00:10:31.314 "nvme_io_md": false, 00:10:31.314 "write_zeroes": true, 00:10:31.314 "zcopy": true, 00:10:31.314 "get_zone_info": false, 00:10:31.314 "zone_management": false, 00:10:31.314 "zone_append": false, 00:10:31.314 "compare": false, 00:10:31.314 "compare_and_write": false, 00:10:31.314 "abort": true, 00:10:31.314 "seek_hole": false, 00:10:31.314 "seek_data": false, 00:10:31.314 "copy": true, 00:10:31.314 "nvme_iov_md": false 00:10:31.314 }, 00:10:31.314 "memory_domains": [ 00:10:31.314 { 00:10:31.314 "dma_device_id": "system", 00:10:31.314 "dma_device_type": 1 00:10:31.314 }, 00:10:31.314 { 00:10:31.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.314 "dma_device_type": 2 00:10:31.314 } 00:10:31.314 ], 00:10:31.314 "driver_specific": {} 00:10:31.314 } 00:10:31.314 ] 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.314 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:31.572 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:31.572 "name": "Existed_Raid", 00:10:31.572 "uuid": 
"bb331038-6241-4340-ac4b-731a29f51214", 00:10:31.572 "strip_size_kb": 0, 00:10:31.572 "state": "configuring", 00:10:31.572 "raid_level": "raid1", 00:10:31.572 "superblock": true, 00:10:31.572 "num_base_bdevs": 2, 00:10:31.572 "num_base_bdevs_discovered": 1, 00:10:31.572 "num_base_bdevs_operational": 2, 00:10:31.572 "base_bdevs_list": [ 00:10:31.572 { 00:10:31.572 "name": "BaseBdev1", 00:10:31.572 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:31.572 "is_configured": true, 00:10:31.572 "data_offset": 2048, 00:10:31.572 "data_size": 63488 00:10:31.572 }, 00:10:31.572 { 00:10:31.572 "name": "BaseBdev2", 00:10:31.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.572 "is_configured": false, 00:10:31.572 "data_offset": 0, 00:10:31.572 "data_size": 0 00:10:31.572 } 00:10:31.572 ] 00:10:31.572 }' 00:10:31.572 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:31.572 18:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:32.139 18:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:32.139 [2024-07-24 18:47:17.005434] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:32.139 [2024-07-24 18:47:17.005464] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9d3470 name Existed_Raid, state configuring 00:10:32.139 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:32.399 [2024-07-24 18:47:17.173898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:32.399 [2024-07-24 18:47:17.174945] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:32.399 [2024-07-24 18:47:17.174969] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.399 "name": "Existed_Raid", 00:10:32.399 "uuid": "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a", 00:10:32.399 "strip_size_kb": 0, 00:10:32.399 "state": "configuring", 00:10:32.399 "raid_level": "raid1", 00:10:32.399 "superblock": true, 00:10:32.399 "num_base_bdevs": 2, 00:10:32.399 "num_base_bdevs_discovered": 1, 00:10:32.399 "num_base_bdevs_operational": 2, 00:10:32.399 "base_bdevs_list": [ 00:10:32.399 { 00:10:32.399 "name": "BaseBdev1", 00:10:32.399 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:32.399 "is_configured": true, 00:10:32.399 "data_offset": 2048, 00:10:32.399 "data_size": 63488 00:10:32.399 }, 00:10:32.399 { 00:10:32.399 "name": "BaseBdev2", 00:10:32.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.399 "is_configured": false, 00:10:32.399 "data_offset": 0, 00:10:32.399 "data_size": 0 00:10:32.399 } 00:10:32.399 ] 00:10:32.399 }' 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:32.399 18:47:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:32.965 18:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:33.224 [2024-07-24 18:47:18.002693] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:33.225 [2024-07-24 18:47:18.002795] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x9d4260 00:10:33.225 [2024-07-24 18:47:18.002802] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:33.225 [2024-07-24 18:47:18.002915] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9d33c0 00:10:33.225 [2024-07-24 18:47:18.002993] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9d4260 00:10:33.225 [2024-07-24 18:47:18.002999] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9d4260 00:10:33.225 [2024-07-24 18:47:18.003056] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:33.225 BaseBdev2 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 
00:10:33.225 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:33.483 [ 00:10:33.483 { 00:10:33.483 "name": "BaseBdev2", 00:10:33.483 "aliases": [ 00:10:33.483 "94c4f485-5e2e-4362-b73d-6139de52ac88" 00:10:33.483 ], 00:10:33.483 "product_name": "Malloc disk", 00:10:33.483 "block_size": 512, 00:10:33.483 "num_blocks": 65536, 00:10:33.483 "uuid": "94c4f485-5e2e-4362-b73d-6139de52ac88", 00:10:33.483 "assigned_rate_limits": { 00:10:33.483 "rw_ios_per_sec": 0, 00:10:33.483 "rw_mbytes_per_sec": 0, 00:10:33.483 "r_mbytes_per_sec": 0, 00:10:33.483 "w_mbytes_per_sec": 0 00:10:33.483 }, 00:10:33.483 "claimed": true, 00:10:33.483 "claim_type": "exclusive_write", 00:10:33.483 "zoned": false, 00:10:33.483 "supported_io_types": { 00:10:33.483 "read": true, 00:10:33.483 "write": true, 00:10:33.483 "unmap": true, 00:10:33.483 "flush": true, 00:10:33.483 "reset": true, 00:10:33.483 "nvme_admin": false, 00:10:33.483 "nvme_io": false, 00:10:33.483 "nvme_io_md": false, 00:10:33.483 "write_zeroes": true, 00:10:33.483 "zcopy": true, 00:10:33.483 "get_zone_info": false, 00:10:33.483 "zone_management": false, 00:10:33.483 "zone_append": false, 00:10:33.483 "compare": false, 00:10:33.483 "compare_and_write": false, 00:10:33.483 "abort": true, 00:10:33.483 "seek_hole": false, 00:10:33.483 "seek_data": false, 00:10:33.483 "copy": true, 00:10:33.483 "nvme_iov_md": false 00:10:33.483 }, 00:10:33.483 "memory_domains": [ 00:10:33.483 { 00:10:33.483 "dma_device_id": "system", 00:10:33.483 "dma_device_type": 1 00:10:33.483 }, 00:10:33.483 { 00:10:33.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:33.483 "dma_device_type": 2 00:10:33.483 } 00:10:33.483 ], 00:10:33.483 "driver_specific": {} 00:10:33.483 } 00:10:33.483 ] 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.483 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:33.483 18:47:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.740 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:33.740 "name": "Existed_Raid", 00:10:33.740 "uuid": "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a", 00:10:33.740 "strip_size_kb": 0, 00:10:33.740 "state": "online", 00:10:33.740 "raid_level": "raid1", 00:10:33.740 "superblock": true, 00:10:33.740 "num_base_bdevs": 2, 00:10:33.740 "num_base_bdevs_discovered": 2, 00:10:33.740 "num_base_bdevs_operational": 2, 00:10:33.740 "base_bdevs_list": [ 00:10:33.740 { 00:10:33.740 "name": "BaseBdev1", 00:10:33.740 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:33.740 "is_configured": true, 00:10:33.740 "data_offset": 2048, 00:10:33.740 "data_size": 63488 00:10:33.740 }, 00:10:33.740 { 00:10:33.740 "name": "BaseBdev2", 00:10:33.740 "uuid": "94c4f485-5e2e-4362-b73d-6139de52ac88", 00:10:33.740 "is_configured": true, 00:10:33.740 "data_offset": 2048, 00:10:33.740 "data_size": 63488 00:10:33.740 } 00:10:33.740 ] 00:10:33.740 }' 00:10:33.740 18:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:33.740 18:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:34.305 [2024-07-24 18:47:19.169890] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:34.305 "name": "Existed_Raid", 00:10:34.305 "aliases": [ 00:10:34.305 "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a" 00:10:34.305 ], 00:10:34.305 "product_name": "Raid Volume", 00:10:34.305 "block_size": 512, 00:10:34.305 "num_blocks": 63488, 00:10:34.305 "uuid": "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a", 00:10:34.305 "assigned_rate_limits": { 00:10:34.305 "rw_ios_per_sec": 0, 00:10:34.305 "rw_mbytes_per_sec": 0, 00:10:34.305 "r_mbytes_per_sec": 0, 00:10:34.305 "w_mbytes_per_sec": 0 00:10:34.305 }, 00:10:34.305 "claimed": false, 00:10:34.305 "zoned": false, 00:10:34.305 "supported_io_types": { 00:10:34.305 "read": true, 00:10:34.305 "write": true, 00:10:34.305 "unmap": false, 00:10:34.305 "flush": false, 00:10:34.305 "reset": true, 00:10:34.305 "nvme_admin": false, 00:10:34.305 "nvme_io": false, 00:10:34.305 "nvme_io_md": false, 00:10:34.305 "write_zeroes": true, 00:10:34.305 "zcopy": false, 00:10:34.305 "get_zone_info": false, 00:10:34.305 
"zone_management": false, 00:10:34.305 "zone_append": false, 00:10:34.305 "compare": false, 00:10:34.305 "compare_and_write": false, 00:10:34.305 "abort": false, 00:10:34.305 "seek_hole": false, 00:10:34.305 "seek_data": false, 00:10:34.305 "copy": false, 00:10:34.305 "nvme_iov_md": false 00:10:34.305 }, 00:10:34.305 "memory_domains": [ 00:10:34.305 { 00:10:34.305 "dma_device_id": "system", 00:10:34.305 "dma_device_type": 1 00:10:34.305 }, 00:10:34.305 { 00:10:34.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.305 "dma_device_type": 2 00:10:34.305 }, 00:10:34.305 { 00:10:34.305 "dma_device_id": "system", 00:10:34.305 "dma_device_type": 1 00:10:34.305 }, 00:10:34.305 { 00:10:34.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.305 "dma_device_type": 2 00:10:34.305 } 00:10:34.305 ], 00:10:34.305 "driver_specific": { 00:10:34.305 "raid": { 00:10:34.305 "uuid": "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a", 00:10:34.305 "strip_size_kb": 0, 00:10:34.305 "state": "online", 00:10:34.305 "raid_level": "raid1", 00:10:34.305 "superblock": true, 00:10:34.305 "num_base_bdevs": 2, 00:10:34.305 "num_base_bdevs_discovered": 2, 00:10:34.305 "num_base_bdevs_operational": 2, 00:10:34.305 "base_bdevs_list": [ 00:10:34.305 { 00:10:34.305 "name": "BaseBdev1", 00:10:34.305 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:34.305 "is_configured": true, 00:10:34.305 "data_offset": 2048, 00:10:34.305 "data_size": 63488 00:10:34.305 }, 00:10:34.305 { 00:10:34.305 "name": "BaseBdev2", 00:10:34.305 "uuid": "94c4f485-5e2e-4362-b73d-6139de52ac88", 00:10:34.305 "is_configured": true, 00:10:34.305 "data_offset": 2048, 00:10:34.305 "data_size": 63488 00:10:34.305 } 00:10:34.305 ] 00:10:34.305 } 00:10:34.305 } 00:10:34.305 }' 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:34.305 BaseBdev2' 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:34.305 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:34.562 "name": "BaseBdev1", 00:10:34.562 "aliases": [ 00:10:34.562 "8d65389b-a95c-432f-bc25-fbb5a4e123fc" 00:10:34.562 ], 00:10:34.562 "product_name": "Malloc disk", 00:10:34.562 "block_size": 512, 00:10:34.562 "num_blocks": 65536, 00:10:34.562 "uuid": "8d65389b-a95c-432f-bc25-fbb5a4e123fc", 00:10:34.562 "assigned_rate_limits": { 00:10:34.562 "rw_ios_per_sec": 0, 00:10:34.562 "rw_mbytes_per_sec": 0, 00:10:34.562 "r_mbytes_per_sec": 0, 00:10:34.562 "w_mbytes_per_sec": 0 00:10:34.562 }, 00:10:34.562 "claimed": true, 00:10:34.562 "claim_type": "exclusive_write", 00:10:34.562 "zoned": false, 00:10:34.562 "supported_io_types": { 00:10:34.562 "read": true, 00:10:34.562 "write": true, 00:10:34.562 "unmap": true, 00:10:34.562 "flush": true, 00:10:34.562 "reset": true, 00:10:34.562 "nvme_admin": false, 00:10:34.562 "nvme_io": false, 00:10:34.562 "nvme_io_md": false, 00:10:34.562 "write_zeroes": true, 00:10:34.562 "zcopy": true, 00:10:34.562 "get_zone_info": 
false, 00:10:34.562 "zone_management": false, 00:10:34.562 "zone_append": false, 00:10:34.562 "compare": false, 00:10:34.562 "compare_and_write": false, 00:10:34.562 "abort": true, 00:10:34.562 "seek_hole": false, 00:10:34.562 "seek_data": false, 00:10:34.562 "copy": true, 00:10:34.562 "nvme_iov_md": false 00:10:34.562 }, 00:10:34.562 "memory_domains": [ 00:10:34.562 { 00:10:34.562 "dma_device_id": "system", 00:10:34.562 "dma_device_type": 1 00:10:34.562 }, 00:10:34.562 { 00:10:34.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.562 "dma_device_type": 2 00:10:34.562 } 00:10:34.562 ], 00:10:34.562 "driver_specific": {} 00:10:34.562 }' 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:34.562 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:34.819 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:34.819 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:34.819 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:34.819 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:34.820 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:34.820 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:34.820 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:34.820 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:35.078 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:35.078 "name": "BaseBdev2", 00:10:35.078 "aliases": [ 00:10:35.078 "94c4f485-5e2e-4362-b73d-6139de52ac88" 00:10:35.078 ], 00:10:35.078 "product_name": "Malloc disk", 00:10:35.078 "block_size": 512, 00:10:35.078 "num_blocks": 65536, 00:10:35.078 "uuid": "94c4f485-5e2e-4362-b73d-6139de52ac88", 00:10:35.078 "assigned_rate_limits": { 00:10:35.078 "rw_ios_per_sec": 0, 00:10:35.078 "rw_mbytes_per_sec": 0, 00:10:35.078 "r_mbytes_per_sec": 0, 00:10:35.078 "w_mbytes_per_sec": 0 00:10:35.078 }, 00:10:35.078 "claimed": true, 00:10:35.078 "claim_type": "exclusive_write", 00:10:35.078 "zoned": false, 00:10:35.078 "supported_io_types": { 00:10:35.078 "read": true, 00:10:35.078 "write": true, 00:10:35.078 "unmap": true, 00:10:35.078 "flush": true, 00:10:35.078 "reset": true, 00:10:35.078 "nvme_admin": false, 00:10:35.078 "nvme_io": false, 00:10:35.078 "nvme_io_md": false, 00:10:35.078 "write_zeroes": true, 00:10:35.078 "zcopy": true, 00:10:35.078 "get_zone_info": false, 00:10:35.078 "zone_management": false, 00:10:35.078 "zone_append": false, 00:10:35.078 "compare": false, 00:10:35.078 
"compare_and_write": false, 00:10:35.078 "abort": true, 00:10:35.078 "seek_hole": false, 00:10:35.078 "seek_data": false, 00:10:35.078 "copy": true, 00:10:35.078 "nvme_iov_md": false 00:10:35.078 }, 00:10:35.078 "memory_domains": [ 00:10:35.078 { 00:10:35.078 "dma_device_id": "system", 00:10:35.078 "dma_device_type": 1 00:10:35.078 }, 00:10:35.078 { 00:10:35.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.078 "dma_device_type": 2 00:10:35.078 } 00:10:35.078 ], 00:10:35.078 "driver_specific": {} 00:10:35.078 }' 00:10:35.078 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.078 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.078 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:35.078 18:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.078 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.078 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:35.078 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:35.335 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:35.593 [2024-07-24 18:47:20.380884] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:35.593 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:35.593 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:35.593 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:35.594 "name": "Existed_Raid", 00:10:35.594 "uuid": "08e1e3c6-e8ca-4e4b-b11c-fc7e5a5e793a", 00:10:35.594 "strip_size_kb": 0, 00:10:35.594 "state": "online", 00:10:35.594 "raid_level": "raid1", 00:10:35.594 "superblock": true, 00:10:35.594 "num_base_bdevs": 2, 00:10:35.594 "num_base_bdevs_discovered": 1, 00:10:35.594 "num_base_bdevs_operational": 1, 00:10:35.594 "base_bdevs_list": [ 00:10:35.594 { 00:10:35.594 "name": null, 00:10:35.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:35.594 "is_configured": false, 00:10:35.594 "data_offset": 2048, 00:10:35.594 "data_size": 63488 00:10:35.594 }, 00:10:35.594 { 00:10:35.594 "name": "BaseBdev2", 00:10:35.594 "uuid": "94c4f485-5e2e-4362-b73d-6139de52ac88", 00:10:35.594 "is_configured": true, 00:10:35.594 "data_offset": 2048, 00:10:35.594 "data_size": 63488 00:10:35.594 } 00:10:35.594 ] 00:10:35.594 }' 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.594 18:47:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:36.158 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:36.158 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:36.158 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:36.158 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:36.416 [2024-07-24 18:47:21.388368] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:36.416 [2024-07-24 18:47:21.388428] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.416 [2024-07-24 18:47:21.398512] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.416 [2024-07-24 18:47:21.398552] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.416 [2024-07-24 18:47:21.398558] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9d4260 name Existed_Raid, state offline 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.416 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2061108 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2061108 ']' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2061108 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2061108 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2061108' 00:10:36.706 killing process with pid 2061108 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2061108 00:10:36.706 [2024-07-24 18:47:21.618970] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:36.706 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2061108 00:10:36.706 [2024-07-24 18:47:21.619767] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:36.965 18:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:36.965 00:10:36.965 real 0m8.114s 00:10:36.965 user 0m14.454s 00:10:36.965 sys 0m1.391s 00:10:36.965 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:36.965 18:47:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:36.965 ************************************ 00:10:36.965 END TEST raid_state_function_test_sb 00:10:36.965 ************************************ 00:10:36.965 18:47:21 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:10:36.965 18:47:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:36.965 18:47:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:36.965 18:47:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:36.965 ************************************ 00:10:36.965 START TEST raid_superblock_test 00:10:36.965 ************************************ 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:10:36.966 18:47:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2062698 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2062698 /var/tmp/spdk-raid.sock 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2062698 ']' 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:36.966 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:36.966 18:47:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.966 [2024-07-24 18:47:21.899494] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:10:36.966 [2024-07-24 18:47:21.899533] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2062698 ] 00:10:36.966 [2024-07-24 18:47:21.960830] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.224 [2024-07-24 18:47:22.039873] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.224 [2024-07-24 18:47:22.092398] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.224 [2024-07-24 18:47:22.092425] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:37.791 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:38.050 malloc1 00:10:38.050 18:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:38.050 [2024-07-24 18:47:23.024481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:38.050 [2024-07-24 18:47:23.024516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.050 [2024-07-24 18:47:23.024527] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe13e20 00:10:38.050 [2024-07-24 18:47:23.024548] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.050 [2024-07-24 18:47:23.025702] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.050 [2024-07-24 18:47:23.025727] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:38.050 pt1 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:38.050 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:38.308 malloc2 00:10:38.308 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:38.574 [2024-07-24 18:47:23.360995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:38.574 [2024-07-24 18:47:23.361026] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.574 [2024-07-24 18:47:23.361036] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbded0 00:10:38.574 [2024-07-24 18:47:23.361041] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.575 [2024-07-24 18:47:23.362089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.575 [2024-07-24 18:47:23.362109] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:38.575 pt2 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:38.575 [2024-07-24 18:47:23.529454] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:38.575 [2024-07-24 18:47:23.530377] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:38.575 [2024-07-24 18:47:23.530491] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfbd170 00:10:38.575 [2024-07-24 18:47:23.530500] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:38.575 [2024-07-24 18:47:23.530637] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbf5d0 00:10:38.575 [2024-07-24 18:47:23.530740] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbd170 00:10:38.575 [2024-07-24 18:47:23.530746] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfbd170 00:10:38.575 [2024-07-24 18:47:23.530816] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:38.575 18:47:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.575 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:38.836 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:38.836 "name": "raid_bdev1", 00:10:38.836 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:38.836 "strip_size_kb": 0, 00:10:38.836 "state": "online", 00:10:38.836 "raid_level": "raid1", 00:10:38.836 "superblock": true, 00:10:38.836 "num_base_bdevs": 2, 00:10:38.836 "num_base_bdevs_discovered": 2, 00:10:38.836 "num_base_bdevs_operational": 2, 00:10:38.836 "base_bdevs_list": [ 00:10:38.836 { 00:10:38.836 "name": "pt1", 00:10:38.836 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:38.836 "is_configured": true, 00:10:38.836 "data_offset": 2048, 00:10:38.836 "data_size": 63488 00:10:38.836 }, 00:10:38.836 { 00:10:38.836 "name": "pt2", 00:10:38.836 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:38.836 "is_configured": true, 00:10:38.836 "data_offset": 2048, 00:10:38.836 "data_size": 63488 00:10:38.836 } 00:10:38.836 ] 00:10:38.836 }' 00:10:38.836 18:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:38.836 18:47:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:39.404 [2024-07-24 18:47:24.275500] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:39.404 "name": "raid_bdev1", 00:10:39.404 "aliases": [ 00:10:39.404 "78157c23-9574-4718-b1e2-7be8b41739e2" 00:10:39.404 ], 00:10:39.404 "product_name": "Raid Volume", 00:10:39.404 "block_size": 512, 00:10:39.404 "num_blocks": 63488, 00:10:39.404 "uuid": 
"78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:39.404 "assigned_rate_limits": { 00:10:39.404 "rw_ios_per_sec": 0, 00:10:39.404 "rw_mbytes_per_sec": 0, 00:10:39.404 "r_mbytes_per_sec": 0, 00:10:39.404 "w_mbytes_per_sec": 0 00:10:39.404 }, 00:10:39.404 "claimed": false, 00:10:39.404 "zoned": false, 00:10:39.404 "supported_io_types": { 00:10:39.404 "read": true, 00:10:39.404 "write": true, 00:10:39.404 "unmap": false, 00:10:39.404 "flush": false, 00:10:39.404 "reset": true, 00:10:39.404 "nvme_admin": false, 00:10:39.404 "nvme_io": false, 00:10:39.404 "nvme_io_md": false, 00:10:39.404 "write_zeroes": true, 00:10:39.404 "zcopy": false, 00:10:39.404 "get_zone_info": false, 00:10:39.404 "zone_management": false, 00:10:39.404 "zone_append": false, 00:10:39.404 "compare": false, 00:10:39.404 "compare_and_write": false, 00:10:39.404 "abort": false, 00:10:39.404 "seek_hole": false, 00:10:39.404 "seek_data": false, 00:10:39.404 "copy": false, 00:10:39.404 "nvme_iov_md": false 00:10:39.404 }, 00:10:39.404 "memory_domains": [ 00:10:39.404 { 00:10:39.404 "dma_device_id": "system", 00:10:39.404 "dma_device_type": 1 00:10:39.404 }, 00:10:39.404 { 00:10:39.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.404 "dma_device_type": 2 00:10:39.404 }, 00:10:39.404 { 00:10:39.404 "dma_device_id": "system", 00:10:39.404 "dma_device_type": 1 00:10:39.404 }, 00:10:39.404 { 00:10:39.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.404 "dma_device_type": 2 00:10:39.404 } 00:10:39.404 ], 00:10:39.404 "driver_specific": { 00:10:39.404 "raid": { 00:10:39.404 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:39.404 "strip_size_kb": 0, 00:10:39.404 "state": "online", 00:10:39.404 "raid_level": "raid1", 00:10:39.404 "superblock": true, 00:10:39.404 "num_base_bdevs": 2, 00:10:39.404 "num_base_bdevs_discovered": 2, 00:10:39.404 "num_base_bdevs_operational": 2, 00:10:39.404 "base_bdevs_list": [ 00:10:39.404 { 00:10:39.404 "name": "pt1", 00:10:39.404 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:39.404 "is_configured": true, 00:10:39.404 "data_offset": 2048, 00:10:39.404 "data_size": 63488 00:10:39.404 }, 00:10:39.404 { 00:10:39.404 "name": "pt2", 00:10:39.404 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:39.404 "is_configured": true, 00:10:39.404 "data_offset": 2048, 00:10:39.404 "data_size": 63488 00:10:39.404 } 00:10:39.404 ] 00:10:39.404 } 00:10:39.404 } 00:10:39.404 }' 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:39.404 pt2' 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:39.404 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:39.664 "name": "pt1", 00:10:39.664 "aliases": [ 00:10:39.664 "00000000-0000-0000-0000-000000000001" 00:10:39.664 ], 00:10:39.664 "product_name": "passthru", 00:10:39.664 "block_size": 512, 00:10:39.664 "num_blocks": 65536, 00:10:39.664 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:39.664 "assigned_rate_limits": { 00:10:39.664 
"rw_ios_per_sec": 0, 00:10:39.664 "rw_mbytes_per_sec": 0, 00:10:39.664 "r_mbytes_per_sec": 0, 00:10:39.664 "w_mbytes_per_sec": 0 00:10:39.664 }, 00:10:39.664 "claimed": true, 00:10:39.664 "claim_type": "exclusive_write", 00:10:39.664 "zoned": false, 00:10:39.664 "supported_io_types": { 00:10:39.664 "read": true, 00:10:39.664 "write": true, 00:10:39.664 "unmap": true, 00:10:39.664 "flush": true, 00:10:39.664 "reset": true, 00:10:39.664 "nvme_admin": false, 00:10:39.664 "nvme_io": false, 00:10:39.664 "nvme_io_md": false, 00:10:39.664 "write_zeroes": true, 00:10:39.664 "zcopy": true, 00:10:39.664 "get_zone_info": false, 00:10:39.664 "zone_management": false, 00:10:39.664 "zone_append": false, 00:10:39.664 "compare": false, 00:10:39.664 "compare_and_write": false, 00:10:39.664 "abort": true, 00:10:39.664 "seek_hole": false, 00:10:39.664 "seek_data": false, 00:10:39.664 "copy": true, 00:10:39.664 "nvme_iov_md": false 00:10:39.664 }, 00:10:39.664 "memory_domains": [ 00:10:39.664 { 00:10:39.664 "dma_device_id": "system", 00:10:39.664 "dma_device_type": 1 00:10:39.664 }, 00:10:39.664 { 00:10:39.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.664 "dma_device_type": 2 00:10:39.664 } 00:10:39.664 ], 00:10:39.664 "driver_specific": { 00:10:39.664 "passthru": { 00:10:39.664 "name": "pt1", 00:10:39.664 "base_bdev_name": "malloc1" 00:10:39.664 } 00:10:39.664 } 00:10:39.664 }' 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:39.664 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:39.923 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:39.924 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:40.183 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:40.183 "name": "pt2", 00:10:40.183 "aliases": [ 00:10:40.183 "00000000-0000-0000-0000-000000000002" 00:10:40.183 ], 00:10:40.183 "product_name": "passthru", 00:10:40.183 "block_size": 512, 00:10:40.183 "num_blocks": 65536, 00:10:40.183 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:40.183 "assigned_rate_limits": { 00:10:40.183 "rw_ios_per_sec": 0, 00:10:40.183 "rw_mbytes_per_sec": 0, 00:10:40.183 "r_mbytes_per_sec": 0, 00:10:40.183 "w_mbytes_per_sec": 0 
00:10:40.183 }, 00:10:40.183 "claimed": true, 00:10:40.183 "claim_type": "exclusive_write", 00:10:40.183 "zoned": false, 00:10:40.183 "supported_io_types": { 00:10:40.183 "read": true, 00:10:40.183 "write": true, 00:10:40.183 "unmap": true, 00:10:40.183 "flush": true, 00:10:40.183 "reset": true, 00:10:40.183 "nvme_admin": false, 00:10:40.183 "nvme_io": false, 00:10:40.183 "nvme_io_md": false, 00:10:40.183 "write_zeroes": true, 00:10:40.183 "zcopy": true, 00:10:40.183 "get_zone_info": false, 00:10:40.183 "zone_management": false, 00:10:40.183 "zone_append": false, 00:10:40.183 "compare": false, 00:10:40.183 "compare_and_write": false, 00:10:40.183 "abort": true, 00:10:40.183 "seek_hole": false, 00:10:40.183 "seek_data": false, 00:10:40.183 "copy": true, 00:10:40.183 "nvme_iov_md": false 00:10:40.183 }, 00:10:40.183 "memory_domains": [ 00:10:40.183 { 00:10:40.183 "dma_device_id": "system", 00:10:40.183 "dma_device_type": 1 00:10:40.183 }, 00:10:40.183 { 00:10:40.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.183 "dma_device_type": 2 00:10:40.183 } 00:10:40.183 ], 00:10:40.183 "driver_specific": { 00:10:40.183 "passthru": { 00:10:40.183 "name": "pt2", 00:10:40.183 "base_bdev_name": "malloc2" 00:10:40.183 } 00:10:40.183 } 00:10:40.183 }' 00:10:40.183 18:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.183 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:40.442 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:40.442 [2024-07-24 18:47:25.438492] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:40.702 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=78157c23-9574-4718-b1e2-7be8b41739e2 00:10:40.702 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 78157c23-9574-4718-b1e2-7be8b41739e2 ']' 00:10:40.702 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:40.702 [2024-07-24 18:47:25.606774] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:40.702 [2024-07-24 18:47:25.606787] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:10:40.702 [2024-07-24 18:47:25.606825] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:40.702 [2024-07-24 18:47:25.606862] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:40.702 [2024-07-24 18:47:25.606868] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbd170 name raid_bdev1, state offline 00:10:40.702 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.702 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:40.961 18:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:41.220 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:41.220 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:41.480 [2024-07-24 18:47:26.448927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:41.480 [2024-07-24 18:47:26.449916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:41.480 [2024-07-24 18:47:26.449959] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:41.480 [2024-07-24 18:47:26.449986] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:41.480 [2024-07-24 18:47:26.449996] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:41.480 [2024-07-24 18:47:26.450001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbe900 name raid_bdev1, state configuring 00:10:41.480 request: 00:10:41.480 { 00:10:41.480 "name": "raid_bdev1", 00:10:41.480 "raid_level": "raid1", 00:10:41.480 "base_bdevs": [ 00:10:41.480 "malloc1", 00:10:41.480 "malloc2" 00:10:41.480 ], 00:10:41.480 "superblock": false, 00:10:41.480 "method": "bdev_raid_create", 00:10:41.480 "req_id": 1 00:10:41.480 } 00:10:41.480 Got JSON-RPC error response 00:10:41.480 response: 00:10:41.480 { 00:10:41.480 "code": -17, 00:10:41.480 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:41.480 } 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.480 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:41.739 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:41.739 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:41.739 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:41.999 [2024-07-24 18:47:26.789781] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:41.999 [2024-07-24 18:47:26.789809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:41.999 [2024-07-24 18:47:26.789820] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe12760 00:10:41.999 [2024-07-24 18:47:26.789826] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:41.999 [2024-07-24 18:47:26.790995] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:10:41.999 [2024-07-24 18:47:26.791017] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:41.999 [2024-07-24 18:47:26.791062] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:41.999 [2024-07-24 18:47:26.791078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:41.999 pt1 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.999 "name": "raid_bdev1", 00:10:41.999 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:41.999 "strip_size_kb": 0, 00:10:41.999 "state": "configuring", 00:10:41.999 "raid_level": "raid1", 00:10:41.999 "superblock": true, 00:10:41.999 "num_base_bdevs": 2, 00:10:41.999 "num_base_bdevs_discovered": 1, 00:10:41.999 "num_base_bdevs_operational": 2, 00:10:41.999 "base_bdevs_list": [ 00:10:41.999 { 00:10:41.999 "name": "pt1", 00:10:41.999 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:41.999 "is_configured": true, 00:10:41.999 "data_offset": 2048, 00:10:41.999 "data_size": 63488 00:10:41.999 }, 00:10:41.999 { 00:10:41.999 "name": null, 00:10:41.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:41.999 "is_configured": false, 00:10:41.999 "data_offset": 2048, 00:10:41.999 "data_size": 63488 00:10:41.999 } 00:10:41.999 ] 00:10:41.999 }' 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.999 18:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:42.567 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:42.567 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:42.567 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:42.567 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:10:42.827 [2024-07-24 18:47:27.607925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:42.827 [2024-07-24 18:47:27.607960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:42.827 [2024-07-24 18:47:27.607970] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe14a40 00:10:42.827 [2024-07-24 18:47:27.607976] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:42.827 [2024-07-24 18:47:27.608219] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:42.827 [2024-07-24 18:47:27.608229] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:42.827 [2024-07-24 18:47:27.608270] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:42.827 [2024-07-24 18:47:27.608282] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:42.827 [2024-07-24 18:47:27.608350] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc04e0 00:10:42.827 [2024-07-24 18:47:27.608355] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:42.827 [2024-07-24 18:47:27.608475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc3cd0 00:10:42.827 [2024-07-24 18:47:27.608580] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc04e0 00:10:42.827 [2024-07-24 18:47:27.608586] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc04e0 00:10:42.827 [2024-07-24 18:47:27.608655] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:42.827 pt2 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:42.827 "name": "raid_bdev1", 00:10:42.827 "uuid": 
"78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:42.827 "strip_size_kb": 0, 00:10:42.827 "state": "online", 00:10:42.827 "raid_level": "raid1", 00:10:42.827 "superblock": true, 00:10:42.827 "num_base_bdevs": 2, 00:10:42.827 "num_base_bdevs_discovered": 2, 00:10:42.827 "num_base_bdevs_operational": 2, 00:10:42.827 "base_bdevs_list": [ 00:10:42.827 { 00:10:42.827 "name": "pt1", 00:10:42.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:42.827 "is_configured": true, 00:10:42.827 "data_offset": 2048, 00:10:42.827 "data_size": 63488 00:10:42.827 }, 00:10:42.827 { 00:10:42.827 "name": "pt2", 00:10:42.827 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:42.827 "is_configured": true, 00:10:42.827 "data_offset": 2048, 00:10:42.827 "data_size": 63488 00:10:42.827 } 00:10:42.827 ] 00:10:42.827 }' 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:42.827 18:47:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:43.395 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:43.655 [2024-07-24 18:47:28.466318] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:43.655 "name": "raid_bdev1", 00:10:43.655 "aliases": [ 00:10:43.655 "78157c23-9574-4718-b1e2-7be8b41739e2" 00:10:43.655 ], 00:10:43.655 "product_name": "Raid Volume", 00:10:43.655 "block_size": 512, 00:10:43.655 "num_blocks": 63488, 00:10:43.655 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:43.655 "assigned_rate_limits": { 00:10:43.655 "rw_ios_per_sec": 0, 00:10:43.655 "rw_mbytes_per_sec": 0, 00:10:43.655 "r_mbytes_per_sec": 0, 00:10:43.655 "w_mbytes_per_sec": 0 00:10:43.655 }, 00:10:43.655 "claimed": false, 00:10:43.655 "zoned": false, 00:10:43.655 "supported_io_types": { 00:10:43.655 "read": true, 00:10:43.655 "write": true, 00:10:43.655 "unmap": false, 00:10:43.655 "flush": false, 00:10:43.655 "reset": true, 00:10:43.655 "nvme_admin": false, 00:10:43.655 "nvme_io": false, 00:10:43.655 "nvme_io_md": false, 00:10:43.655 "write_zeroes": true, 00:10:43.655 "zcopy": false, 00:10:43.655 "get_zone_info": false, 00:10:43.655 "zone_management": false, 00:10:43.655 "zone_append": false, 00:10:43.655 "compare": false, 00:10:43.655 "compare_and_write": false, 00:10:43.655 "abort": false, 00:10:43.655 "seek_hole": false, 00:10:43.655 "seek_data": false, 00:10:43.655 "copy": false, 00:10:43.655 "nvme_iov_md": false 00:10:43.655 }, 00:10:43.655 "memory_domains": [ 00:10:43.655 { 00:10:43.655 "dma_device_id": "system", 00:10:43.655 "dma_device_type": 1 00:10:43.655 }, 
00:10:43.655 { 00:10:43.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.655 "dma_device_type": 2 00:10:43.655 }, 00:10:43.655 { 00:10:43.655 "dma_device_id": "system", 00:10:43.655 "dma_device_type": 1 00:10:43.655 }, 00:10:43.655 { 00:10:43.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.655 "dma_device_type": 2 00:10:43.655 } 00:10:43.655 ], 00:10:43.655 "driver_specific": { 00:10:43.655 "raid": { 00:10:43.655 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:43.655 "strip_size_kb": 0, 00:10:43.655 "state": "online", 00:10:43.655 "raid_level": "raid1", 00:10:43.655 "superblock": true, 00:10:43.655 "num_base_bdevs": 2, 00:10:43.655 "num_base_bdevs_discovered": 2, 00:10:43.655 "num_base_bdevs_operational": 2, 00:10:43.655 "base_bdevs_list": [ 00:10:43.655 { 00:10:43.655 "name": "pt1", 00:10:43.655 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:43.655 "is_configured": true, 00:10:43.655 "data_offset": 2048, 00:10:43.655 "data_size": 63488 00:10:43.655 }, 00:10:43.655 { 00:10:43.655 "name": "pt2", 00:10:43.655 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:43.655 "is_configured": true, 00:10:43.655 "data_offset": 2048, 00:10:43.655 "data_size": 63488 00:10:43.655 } 00:10:43.655 ] 00:10:43.655 } 00:10:43.655 } 00:10:43.655 }' 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:43.655 pt2' 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:43.655 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:43.915 "name": "pt1", 00:10:43.915 "aliases": [ 00:10:43.915 "00000000-0000-0000-0000-000000000001" 00:10:43.915 ], 00:10:43.915 "product_name": "passthru", 00:10:43.915 "block_size": 512, 00:10:43.915 "num_blocks": 65536, 00:10:43.915 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:43.915 "assigned_rate_limits": { 00:10:43.915 "rw_ios_per_sec": 0, 00:10:43.915 "rw_mbytes_per_sec": 0, 00:10:43.915 "r_mbytes_per_sec": 0, 00:10:43.915 "w_mbytes_per_sec": 0 00:10:43.915 }, 00:10:43.915 "claimed": true, 00:10:43.915 "claim_type": "exclusive_write", 00:10:43.915 "zoned": false, 00:10:43.915 "supported_io_types": { 00:10:43.915 "read": true, 00:10:43.915 "write": true, 00:10:43.915 "unmap": true, 00:10:43.915 "flush": true, 00:10:43.915 "reset": true, 00:10:43.915 "nvme_admin": false, 00:10:43.915 "nvme_io": false, 00:10:43.915 "nvme_io_md": false, 00:10:43.915 "write_zeroes": true, 00:10:43.915 "zcopy": true, 00:10:43.915 "get_zone_info": false, 00:10:43.915 "zone_management": false, 00:10:43.915 "zone_append": false, 00:10:43.915 "compare": false, 00:10:43.915 "compare_and_write": false, 00:10:43.915 "abort": true, 00:10:43.915 "seek_hole": false, 00:10:43.915 "seek_data": false, 00:10:43.915 "copy": true, 00:10:43.915 "nvme_iov_md": false 00:10:43.915 }, 00:10:43.915 "memory_domains": [ 00:10:43.915 { 00:10:43.915 "dma_device_id": "system", 00:10:43.915 "dma_device_type": 1 00:10:43.915 }, 00:10:43.915 { 00:10:43.915 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:43.915 "dma_device_type": 2 00:10:43.915 } 00:10:43.915 ], 00:10:43.915 "driver_specific": { 00:10:43.915 "passthru": { 00:10:43.915 "name": "pt1", 00:10:43.915 "base_bdev_name": "malloc1" 00:10:43.915 } 00:10:43.915 } 00:10:43.915 }' 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:43.915 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.174 18:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.174 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:44.174 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:44.174 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:44.174 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:44.174 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:44.174 "name": "pt2", 00:10:44.174 "aliases": [ 00:10:44.174 "00000000-0000-0000-0000-000000000002" 00:10:44.174 ], 00:10:44.174 "product_name": "passthru", 00:10:44.174 "block_size": 512, 00:10:44.174 "num_blocks": 65536, 00:10:44.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:44.174 "assigned_rate_limits": { 00:10:44.174 "rw_ios_per_sec": 0, 00:10:44.174 "rw_mbytes_per_sec": 0, 00:10:44.174 "r_mbytes_per_sec": 0, 00:10:44.174 "w_mbytes_per_sec": 0 00:10:44.174 }, 00:10:44.174 "claimed": true, 00:10:44.174 "claim_type": "exclusive_write", 00:10:44.174 "zoned": false, 00:10:44.174 "supported_io_types": { 00:10:44.174 "read": true, 00:10:44.174 "write": true, 00:10:44.174 "unmap": true, 00:10:44.174 "flush": true, 00:10:44.174 "reset": true, 00:10:44.174 "nvme_admin": false, 00:10:44.174 "nvme_io": false, 00:10:44.174 "nvme_io_md": false, 00:10:44.174 "write_zeroes": true, 00:10:44.174 "zcopy": true, 00:10:44.174 "get_zone_info": false, 00:10:44.174 "zone_management": false, 00:10:44.174 "zone_append": false, 00:10:44.174 "compare": false, 00:10:44.174 "compare_and_write": false, 00:10:44.174 "abort": true, 00:10:44.174 "seek_hole": false, 00:10:44.174 "seek_data": false, 00:10:44.174 "copy": true, 00:10:44.174 "nvme_iov_md": false 00:10:44.174 }, 00:10:44.174 "memory_domains": [ 00:10:44.174 { 00:10:44.174 "dma_device_id": "system", 00:10:44.174 "dma_device_type": 1 00:10:44.175 }, 00:10:44.175 { 00:10:44.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.175 "dma_device_type": 2 00:10:44.175 } 00:10:44.175 ], 00:10:44.175 "driver_specific": { 
00:10:44.175 "passthru": { 00:10:44.175 "name": "pt2", 00:10:44.175 "base_bdev_name": "malloc2" 00:10:44.175 } 00:10:44.175 } 00:10:44.175 }' 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.433 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:44.434 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.434 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.434 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:44.434 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.434 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.692 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:44.692 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:44.693 [2024-07-24 18:47:29.625301] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 78157c23-9574-4718-b1e2-7be8b41739e2 '!=' 78157c23-9574-4718-b1e2-7be8b41739e2 ']' 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:44.693 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:44.952 [2024-07-24 18:47:29.793596] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:44.952 18:47:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.952 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:45.211 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.211 "name": "raid_bdev1", 00:10:45.211 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:45.211 "strip_size_kb": 0, 00:10:45.211 "state": "online", 00:10:45.211 "raid_level": "raid1", 00:10:45.211 "superblock": true, 00:10:45.211 "num_base_bdevs": 2, 00:10:45.211 "num_base_bdevs_discovered": 1, 00:10:45.211 "num_base_bdevs_operational": 1, 00:10:45.211 "base_bdevs_list": [ 00:10:45.211 { 00:10:45.211 "name": null, 00:10:45.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.211 "is_configured": false, 00:10:45.211 "data_offset": 2048, 00:10:45.211 "data_size": 63488 00:10:45.211 }, 00:10:45.211 { 00:10:45.211 "name": "pt2", 00:10:45.211 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:45.211 "is_configured": true, 00:10:45.211 "data_offset": 2048, 00:10:45.211 "data_size": 63488 00:10:45.211 } 00:10:45.211 ] 00:10:45.211 }' 00:10:45.211 18:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.211 18:47:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:45.498 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:45.786 [2024-07-24 18:47:30.611697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:45.786 [2024-07-24 18:47:30.611717] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:45.786 [2024-07-24 18:47:30.611758] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:45.786 [2024-07-24 18:47:30.611786] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:45.786 [2024-07-24 18:47:30.611796] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc04e0 name raid_bdev1, state offline 00:10:45.786 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.786 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:10:46.045 18:47:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:10:46.045 18:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:46.304 [2024-07-24 18:47:31.137030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:46.304 [2024-07-24 18:47:31.137061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:46.304 [2024-07-24 18:47:31.137073] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe14050 00:10:46.304 [2024-07-24 18:47:31.137078] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:46.304 [2024-07-24 18:47:31.138242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:46.304 [2024-07-24 18:47:31.138261] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:46.304 [2024-07-24 18:47:31.138302] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:46.304 [2024-07-24 18:47:31.138319] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:46.304 [2024-07-24 18:47:31.138375] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc4f20 00:10:46.304 [2024-07-24 18:47:31.138381] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:46.304 [2024-07-24 18:47:31.138500] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe15420 00:10:46.304 [2024-07-24 18:47:31.138583] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc4f20 00:10:46.304 [2024-07-24 18:47:31.138588] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc4f20 00:10:46.304 [2024-07-24 18:47:31.138651] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.304 pt2 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
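The verify_raid_bdev_state helper traced here amounts to one bdev_raid_get_bdevs call plus a handful of jq field checks. A minimal by-hand sketch of that check, assuming the same rpc.py path, socket, and raid bdev name (raid_bdev1) used in this run:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# pull the JSON for raid_bdev1 only
info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# the fields the helper asserts on after pt1 has been removed
echo "$info" | jq -r '.state'                        # expected: online
echo "$info" | jq -r '.raid_level'                   # expected: raid1
echo "$info" | jq -r '.num_base_bdevs_discovered'    # expected: 1
echo "$info" | jq -r '.base_bdevs_list[] | select(.is_configured == true).name'   # expected: pt2

The same pattern repeats for every verify_raid_bdev_state call in this log; only the expected counts change between invocations.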
00:10:46.304 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:46.563 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:46.563 "name": "raid_bdev1", 00:10:46.563 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:46.563 "strip_size_kb": 0, 00:10:46.563 "state": "online", 00:10:46.563 "raid_level": "raid1", 00:10:46.563 "superblock": true, 00:10:46.563 "num_base_bdevs": 2, 00:10:46.563 "num_base_bdevs_discovered": 1, 00:10:46.563 "num_base_bdevs_operational": 1, 00:10:46.563 "base_bdevs_list": [ 00:10:46.563 { 00:10:46.563 "name": null, 00:10:46.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.563 "is_configured": false, 00:10:46.563 "data_offset": 2048, 00:10:46.563 "data_size": 63488 00:10:46.563 }, 00:10:46.563 { 00:10:46.563 "name": "pt2", 00:10:46.563 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:46.563 "is_configured": true, 00:10:46.563 "data_offset": 2048, 00:10:46.563 "data_size": 63488 00:10:46.563 } 00:10:46.563 ] 00:10:46.563 }' 00:10:46.563 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:46.563 18:47:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.821 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:47.080 [2024-07-24 18:47:31.967176] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:47.080 [2024-07-24 18:47:31.967194] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:47.080 [2024-07-24 18:47:31.967232] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:47.080 [2024-07-24 18:47:31.967263] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:47.080 [2024-07-24 18:47:31.967269] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc4f20 name raid_bdev1, state offline 00:10:47.080 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:10:47.080 18:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:47.340 [2024-07-24 18:47:32.300025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:47.340 [2024-07-24 18:47:32.300061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:47.340 [2024-07-24 18:47:32.300071] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbe100 00:10:47.340 [2024-07-24 18:47:32.300077] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:47.340 [2024-07-24 18:47:32.301191] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:10:47.340 [2024-07-24 18:47:32.301209] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:47.340 [2024-07-24 18:47:32.301251] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:47.340 [2024-07-24 18:47:32.301266] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:47.340 [2024-07-24 18:47:32.301328] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:10:47.340 [2024-07-24 18:47:32.301334] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:47.340 [2024-07-24 18:47:32.301341] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc2df0 name raid_bdev1, state configuring 00:10:47.340 [2024-07-24 18:47:32.301358] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:47.340 [2024-07-24 18:47:32.301393] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc3900 00:10:47.340 [2024-07-24 18:47:32.301398] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:47.340 [2024-07-24 18:47:32.301512] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbebe0 00:10:47.340 [2024-07-24 18:47:32.301592] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc3900 00:10:47.340 [2024-07-24 18:47:32.301597] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc3900 00:10:47.340 [2024-07-24 18:47:32.301660] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:47.340 pt1 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:47.340 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.599 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.599 "name": "raid_bdev1", 00:10:47.599 "uuid": "78157c23-9574-4718-b1e2-7be8b41739e2", 00:10:47.599 "strip_size_kb": 0, 00:10:47.599 "state": "online", 00:10:47.599 "raid_level": "raid1", 
00:10:47.599 "superblock": true, 00:10:47.599 "num_base_bdevs": 2, 00:10:47.599 "num_base_bdevs_discovered": 1, 00:10:47.599 "num_base_bdevs_operational": 1, 00:10:47.599 "base_bdevs_list": [ 00:10:47.599 { 00:10:47.599 "name": null, 00:10:47.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.599 "is_configured": false, 00:10:47.599 "data_offset": 2048, 00:10:47.599 "data_size": 63488 00:10:47.599 }, 00:10:47.599 { 00:10:47.599 "name": "pt2", 00:10:47.599 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:47.599 "is_configured": true, 00:10:47.599 "data_offset": 2048, 00:10:47.599 "data_size": 63488 00:10:47.599 } 00:10:47.599 ] 00:10:47.599 }' 00:10:47.599 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.599 18:47:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.167 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:48.167 18:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:10:48.167 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:10:48.167 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:48.167 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:10:48.426 [2024-07-24 18:47:33.290736] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 78157c23-9574-4718-b1e2-7be8b41739e2 '!=' 78157c23-9574-4718-b1e2-7be8b41739e2 ']' 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2062698 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2062698 ']' 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2062698 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2062698 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2062698' 00:10:48.426 killing process with pid 2062698 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2062698 00:10:48.426 [2024-07-24 18:47:33.346693] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:48.426 [2024-07-24 18:47:33.346742] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:48.426 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2062698 00:10:48.426 [2024-07-24 18:47:33.346774] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:48.426 
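Before killprocess tears the RPC daemon down, the last two assertions (bdev_raid.sh@554 and @557) reduce to two jq comparisons against the running app; a minimal sketch, assuming the same socket and the array UUID reported above:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# slot 0 of the re-assembled raid bdev must be reported as unconfigured
[[ $($RPC bdev_raid_get_bdevs online | jq -r '.[].base_bdevs_list[0].is_configured') == false ]]
# the surviving array must still expose the original UUID
[[ $($RPC bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid') == 78157c23-9574-4718-b1e2-7be8b41739e2 ]]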
[2024-07-24 18:47:33.346781] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc3900 name raid_bdev1, state offline 00:10:48.426 [2024-07-24 18:47:33.362238] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:48.685 18:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:48.685 00:10:48.685 real 0m11.684s 00:10:48.685 user 0m21.377s 00:10:48.685 sys 0m1.828s 00:10:48.685 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:48.685 18:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.685 ************************************ 00:10:48.685 END TEST raid_superblock_test 00:10:48.685 ************************************ 00:10:48.685 18:47:33 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:10:48.685 18:47:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:48.685 18:47:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:48.685 18:47:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:48.685 ************************************ 00:10:48.685 START TEST raid_read_error_test 00:10:48.685 ************************************ 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- 
# mktemp -p /raidtest 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.pVe3zACNE2 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2065069 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2065069 /var/tmp/spdk-raid.sock 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2065069 ']' 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:48.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:48.685 18:47:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.685 [2024-07-24 18:47:33.651774] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:10:48.685 [2024-07-24 18:47:33.651812] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2065069 ] 00:10:48.944 [2024-07-24 18:47:33.714044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:48.944 [2024-07-24 18:47:33.791500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:48.944 [2024-07-24 18:47:33.841966] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:48.944 [2024-07-24 18:47:33.841991] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:49.510 18:47:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:49.510 18:47:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:49.510 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:49.510 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:49.768 BaseBdev1_malloc 00:10:49.768 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:49.768 true 00:10:49.768 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:50.025 [2024-07-24 18:47:34.893682] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:50.025 [2024-07-24 18:47:34.893712] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:50.025 [2024-07-24 18:47:34.893723] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x189cd20 00:10:50.025 [2024-07-24 18:47:34.893729] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:50.025 [2024-07-24 18:47:34.894881] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:50.025 [2024-07-24 18:47:34.894902] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:50.025 BaseBdev1 00:10:50.025 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:50.025 18:47:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:50.282 BaseBdev2_malloc 00:10:50.282 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:50.282 true 00:10:50.282 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:50.539 [2024-07-24 18:47:35.370431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:50.539 [2024-07-24 18:47:35.370460] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:50.539 [2024-07-24 18:47:35.370475] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a1d50 00:10:50.539 [2024-07-24 18:47:35.370497] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:50.539 [2024-07-24 18:47:35.371552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:50.539 [2024-07-24 18:47:35.371573] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:50.539 BaseBdev2 00:10:50.539 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:50.539 [2024-07-24 18:47:35.522849] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:50.539 [2024-07-24 18:47:35.523668] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:50.539 [2024-07-24 18:47:35.523791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a30e0 00:10:50.539 [2024-07-24 18:47:35.523799] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:50.539 [2024-07-24 18:47:35.523922] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ab7d0 00:10:50.539 [2024-07-24 18:47:35.524021] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a30e0 00:10:50.539 [2024-07-24 18:47:35.524026] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a30e0 00:10:50.540 [2024-07-24 18:47:35.524090] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:50.540 18:47:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.540 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:50.797 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:50.797 "name": "raid_bdev1", 00:10:50.797 "uuid": "c4acbfce-a3a7-4d65-8943-0340e0b24592", 00:10:50.797 "strip_size_kb": 0, 00:10:50.797 "state": "online", 00:10:50.797 "raid_level": "raid1", 00:10:50.797 "superblock": true, 00:10:50.797 "num_base_bdevs": 2, 00:10:50.797 "num_base_bdevs_discovered": 2, 00:10:50.797 "num_base_bdevs_operational": 2, 00:10:50.797 "base_bdevs_list": [ 00:10:50.797 { 00:10:50.797 "name": "BaseBdev1", 00:10:50.797 "uuid": "9b983765-ec3f-5b52-b931-7c57e316a92c", 00:10:50.797 "is_configured": true, 00:10:50.797 "data_offset": 2048, 00:10:50.797 "data_size": 63488 00:10:50.797 }, 00:10:50.797 { 00:10:50.797 "name": "BaseBdev2", 00:10:50.797 "uuid": "27ecd5c0-c853-5db5-963e-d7f65b8a8633", 00:10:50.797 "is_configured": true, 00:10:50.797 "data_offset": 2048, 00:10:50.797 "data_size": 63488 00:10:50.797 } 00:10:50.797 ] 00:10:50.797 }' 00:10:50.797 18:47:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.797 18:47:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.362 18:47:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:51.362 18:47:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:51.362 [2024-07-24 18:47:36.248945] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x189ebc0 00:10:52.297 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:52.555 18:47:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.555 "name": "raid_bdev1", 00:10:52.555 "uuid": "c4acbfce-a3a7-4d65-8943-0340e0b24592", 00:10:52.555 "strip_size_kb": 0, 00:10:52.555 "state": "online", 00:10:52.555 "raid_level": "raid1", 00:10:52.555 "superblock": true, 00:10:52.555 "num_base_bdevs": 2, 00:10:52.555 "num_base_bdevs_discovered": 2, 00:10:52.555 "num_base_bdevs_operational": 2, 00:10:52.555 "base_bdevs_list": [ 00:10:52.555 { 00:10:52.555 "name": "BaseBdev1", 00:10:52.555 "uuid": "9b983765-ec3f-5b52-b931-7c57e316a92c", 00:10:52.555 "is_configured": true, 00:10:52.555 "data_offset": 2048, 00:10:52.555 "data_size": 63488 00:10:52.555 }, 00:10:52.555 { 00:10:52.555 "name": "BaseBdev2", 00:10:52.555 "uuid": "27ecd5c0-c853-5db5-963e-d7f65b8a8633", 00:10:52.555 "is_configured": true, 00:10:52.555 "data_offset": 2048, 00:10:52.555 "data_size": 63488 00:10:52.555 } 00:10:52.555 ] 00:10:52.555 }' 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.555 18:47:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.122 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:53.381 [2024-07-24 18:47:38.164429] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:53.381 [2024-07-24 18:47:38.164464] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:53.381 [2024-07-24 18:47:38.166441] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:53.381 [2024-07-24 18:47:38.166462] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.381 [2024-07-24 18:47:38.166516] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:53.381 [2024-07-24 18:47:38.166522] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a30e0 name raid_bdev1, state 
offline 00:10:53.381 0 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2065069 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2065069 ']' 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2065069 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2065069 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2065069' 00:10:53.381 killing process with pid 2065069 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2065069 00:10:53.381 [2024-07-24 18:47:38.216338] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:53.381 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2065069 00:10:53.381 [2024-07-24 18:47:38.225864] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.pVe3zACNE2 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:10:53.639 00:10:53.639 real 0m4.815s 00:10:53.639 user 0m7.354s 00:10:53.639 sys 0m0.649s 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:53.639 18:47:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.639 ************************************ 00:10:53.639 END TEST raid_read_error_test 00:10:53.639 ************************************ 00:10:53.639 18:47:38 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:10:53.639 18:47:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:53.639 18:47:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:53.639 18:47:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:53.639 ************************************ 00:10:53.639 START TEST raid_write_error_test 00:10:53.639 ************************************ 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:10:53.639 18:47:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CcIgZkhIEW 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2065859 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2065859 /var/tmp/spdk-raid.sock 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2065859 ']' 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:53.639 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:53.640 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:53.640 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
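Stripped of the xtrace noise, the write-error harness starting here follows the same shape as the read test just finished: bring up bdevperf in wait-for-RPC mode, stack malloc -> error -> passthru bdevs, assemble the raid1, kick off I/O, then arm the fault. A condensed sketch of that sequence, not the literal script, assuming the workspace paths and the 60-second randrw workload used in this run:

BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# bdevperf in -z mode waits for RPC configuration before running the workload
$BDEVPERF -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
sleep 1   # stand-in for the waitforlisten helper
# stack malloc -> error -> passthru bdevs, then assemble the raid1 with a superblock (-s)
for n in 1 2; do
  $RPC bdev_malloc_create 32 512 -b BaseBdev${n}_malloc
  $RPC bdev_error_create BaseBdev${n}_malloc
  $RPC bdev_passthru_create -b EE_BaseBdev${n}_malloc -p BaseBdev${n}
done
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
# start I/O, then inject a write failure on the first base bdev while the workload runs
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
sleep 1
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure

With raid1, a failed write on one base bdev degrades the array rather than failing bdevperf, which is why the test expects fail_per_s to stay at 0.00 and the surviving array to report a single operational base bdev.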
00:10:53.640 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:53.640 18:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.640 18:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:53.640 [2024-07-24 18:47:38.526080] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:10:53.640 [2024-07-24 18:47:38.526120] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2065859 ] 00:10:53.640 [2024-07-24 18:47:38.588009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:53.898 [2024-07-24 18:47:38.666916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:53.898 [2024-07-24 18:47:38.718127] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.898 [2024-07-24 18:47:38.718155] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:54.464 18:47:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:54.464 18:47:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:54.464 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:54.464 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:54.464 BaseBdev1_malloc 00:10:54.464 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:54.722 true 00:10:54.722 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:54.981 [2024-07-24 18:47:39.794117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:54.981 [2024-07-24 18:47:39.794150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.981 [2024-07-24 18:47:39.794162] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc70d20 00:10:54.981 [2024-07-24 18:47:39.794169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.981 [2024-07-24 18:47:39.795355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.981 [2024-07-24 18:47:39.795375] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:54.981 BaseBdev1 00:10:54.981 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:54.981 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:54.981 BaseBdev2_malloc 00:10:54.981 18:47:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:55.239 true 00:10:55.240 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:55.498 [2024-07-24 18:47:40.306794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:55.499 [2024-07-24 18:47:40.306826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:55.499 [2024-07-24 18:47:40.306837] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc75d50 00:10:55.499 [2024-07-24 18:47:40.306843] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:55.499 [2024-07-24 18:47:40.307940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:55.499 [2024-07-24 18:47:40.307961] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:55.499 BaseBdev2 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:55.499 [2024-07-24 18:47:40.471240] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:55.499 [2024-07-24 18:47:40.472178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:55.499 [2024-07-24 18:47:40.472319] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc770e0 00:10:55.499 [2024-07-24 18:47:40.472327] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:55.499 [2024-07-24 18:47:40.472462] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7f7d0 00:10:55.499 [2024-07-24 18:47:40.472577] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc770e0 00:10:55.499 [2024-07-24 18:47:40.472583] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc770e0 00:10:55.499 [2024-07-24 18:47:40.472654] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.499 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:55.758 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:55.758 "name": "raid_bdev1", 00:10:55.758 "uuid": "cf78c058-4fde-494c-abcb-7ccdfc525e0c", 00:10:55.758 "strip_size_kb": 0, 00:10:55.758 "state": "online", 00:10:55.758 "raid_level": "raid1", 00:10:55.758 "superblock": true, 00:10:55.758 "num_base_bdevs": 2, 00:10:55.758 "num_base_bdevs_discovered": 2, 00:10:55.758 "num_base_bdevs_operational": 2, 00:10:55.758 "base_bdevs_list": [ 00:10:55.758 { 00:10:55.758 "name": "BaseBdev1", 00:10:55.758 "uuid": "022ca86b-20df-5fe5-9458-bcd881bafb09", 00:10:55.758 "is_configured": true, 00:10:55.758 "data_offset": 2048, 00:10:55.758 "data_size": 63488 00:10:55.758 }, 00:10:55.758 { 00:10:55.758 "name": "BaseBdev2", 00:10:55.758 "uuid": "e565396c-2062-53d5-b77e-0e35f933dcc9", 00:10:55.758 "is_configured": true, 00:10:55.758 "data_offset": 2048, 00:10:55.758 "data_size": 63488 00:10:55.758 } 00:10:55.758 ] 00:10:55.758 }' 00:10:55.758 18:47:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:55.758 18:47:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.340 18:47:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:56.340 18:47:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:56.340 [2024-07-24 18:47:41.201325] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc72bc0 00:10:57.282 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:57.282 [2024-07-24 18:47:42.277976] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:10:57.282 [2024-07-24 18:47:42.278029] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:57.282 [2024-07-24 18:47:42.278186] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc72bc0 00:10:57.540 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:57.540 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:10:57.540 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:10:57.540 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:57.541 18:47:42 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:57.541 "name": "raid_bdev1", 00:10:57.541 "uuid": "cf78c058-4fde-494c-abcb-7ccdfc525e0c", 00:10:57.541 "strip_size_kb": 0, 00:10:57.541 "state": "online", 00:10:57.541 "raid_level": "raid1", 00:10:57.541 "superblock": true, 00:10:57.541 "num_base_bdevs": 2, 00:10:57.541 "num_base_bdevs_discovered": 1, 00:10:57.541 "num_base_bdevs_operational": 1, 00:10:57.541 "base_bdevs_list": [ 00:10:57.541 { 00:10:57.541 "name": null, 00:10:57.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.541 "is_configured": false, 00:10:57.541 "data_offset": 2048, 00:10:57.541 "data_size": 63488 00:10:57.541 }, 00:10:57.541 { 00:10:57.541 "name": "BaseBdev2", 00:10:57.541 "uuid": "e565396c-2062-53d5-b77e-0e35f933dcc9", 00:10:57.541 "is_configured": true, 00:10:57.541 "data_offset": 2048, 00:10:57.541 "data_size": 63488 00:10:57.541 } 00:10:57.541 ] 00:10:57.541 }' 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:57.541 18:47:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.108 18:47:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:58.108 [2024-07-24 18:47:43.082150] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:58.108 [2024-07-24 18:47:43.082177] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:58.108 [2024-07-24 18:47:43.084138] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:58.108 [2024-07-24 18:47:43.084158] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:58.108 [2024-07-24 18:47:43.084187] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:58.108 [2024-07-24 18:47:43.084192] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc770e0 name raid_bdev1, state offline 00:10:58.108 0 00:10:58.108 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2065859 00:10:58.108 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2065859 ']' 00:10:58.108 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2065859 00:10:58.108 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:58.108 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:58.108 
18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2065859 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2065859' 00:10:58.368 killing process with pid 2065859 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2065859 00:10:58.368 [2024-07-24 18:47:43.136803] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2065859 00:10:58.368 [2024-07-24 18:47:43.146123] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CcIgZkhIEW 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:10:58.368 00:10:58.368 real 0m4.863s 00:10:58.368 user 0m7.426s 00:10:58.368 sys 0m0.702s 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:58.368 18:47:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.368 ************************************ 00:10:58.368 END TEST raid_write_error_test 00:10:58.368 ************************************ 00:10:58.368 18:47:43 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:58.368 18:47:43 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:58.368 18:47:43 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:10:58.368 18:47:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:58.368 18:47:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:58.368 18:47:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.626 ************************************ 00:10:58.626 START TEST raid_state_function_test 00:10:58.626 ************************************ 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2066857 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2066857' 00:10:58.626 Process raid pid: 2066857 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2066857 /var/tmp/spdk-raid.sock 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2066857 ']' 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:10:58.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:58.626 18:47:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.626 [2024-07-24 18:47:43.457173] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:10:58.626 [2024-07-24 18:47:43.457212] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.626 [2024-07-24 18:47:43.521508] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.626 [2024-07-24 18:47:43.590299] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.884 [2024-07-24 18:47:43.640522] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.884 [2024-07-24 18:47:43.640544] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:59.450 [2024-07-24 18:47:44.395595] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:59.450 [2024-07-24 18:47:44.395629] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:59.450 [2024-07-24 18:47:44.395636] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:59.450 [2024-07-24 18:47:44.395642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:59.450 [2024-07-24 18:47:44.395649] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:59.450 [2024-07-24 18:47:44.395656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.450 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.708 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.708 "name": "Existed_Raid", 00:10:59.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.708 "strip_size_kb": 64, 00:10:59.708 "state": "configuring", 00:10:59.708 "raid_level": "raid0", 00:10:59.708 "superblock": false, 00:10:59.708 "num_base_bdevs": 3, 00:10:59.708 "num_base_bdevs_discovered": 0, 00:10:59.708 "num_base_bdevs_operational": 3, 00:10:59.708 "base_bdevs_list": [ 00:10:59.708 { 00:10:59.708 "name": "BaseBdev1", 00:10:59.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.708 "is_configured": false, 00:10:59.708 "data_offset": 0, 00:10:59.708 "data_size": 0 00:10:59.708 }, 00:10:59.708 { 00:10:59.708 "name": "BaseBdev2", 00:10:59.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.708 "is_configured": false, 00:10:59.708 "data_offset": 0, 00:10:59.708 "data_size": 0 00:10:59.708 }, 00:10:59.708 { 00:10:59.708 "name": "BaseBdev3", 00:10:59.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.708 "is_configured": false, 00:10:59.708 "data_offset": 0, 00:10:59.708 "data_size": 0 00:10:59.708 } 00:10:59.708 ] 00:10:59.708 }' 00:10:59.708 18:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.708 18:47:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.275 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:00.275 [2024-07-24 18:47:45.213613] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:00.275 [2024-07-24 18:47:45.213634] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b7ba0 name Existed_Raid, state configuring 00:11:00.275 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:00.534 [2024-07-24 18:47:45.378047] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:00.534 [2024-07-24 18:47:45.378064] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:00.534 [2024-07-24 18:47:45.378068] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:00.534 [2024-07-24 18:47:45.378073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:00.534 [2024-07-24 18:47:45.378076] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:00.534 [2024-07-24 18:47:45.378081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:00.534 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:00.792 [2024-07-24 18:47:45.546965] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:00.792 BaseBdev1 00:11:00.792 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:00.793 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:01.052 [ 00:11:01.052 { 00:11:01.052 "name": "BaseBdev1", 00:11:01.052 "aliases": [ 00:11:01.052 "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27" 00:11:01.052 ], 00:11:01.052 "product_name": "Malloc disk", 00:11:01.052 "block_size": 512, 00:11:01.052 "num_blocks": 65536, 00:11:01.052 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:01.052 "assigned_rate_limits": { 00:11:01.052 "rw_ios_per_sec": 0, 00:11:01.052 "rw_mbytes_per_sec": 0, 00:11:01.052 "r_mbytes_per_sec": 0, 00:11:01.052 "w_mbytes_per_sec": 0 00:11:01.052 }, 00:11:01.052 "claimed": true, 00:11:01.052 "claim_type": "exclusive_write", 00:11:01.052 "zoned": false, 00:11:01.052 "supported_io_types": { 00:11:01.052 "read": true, 00:11:01.052 "write": true, 00:11:01.052 "unmap": true, 00:11:01.052 "flush": true, 00:11:01.052 "reset": true, 00:11:01.052 "nvme_admin": false, 00:11:01.052 "nvme_io": false, 00:11:01.052 "nvme_io_md": false, 00:11:01.052 "write_zeroes": true, 00:11:01.052 "zcopy": true, 00:11:01.052 "get_zone_info": false, 00:11:01.052 "zone_management": false, 00:11:01.052 "zone_append": false, 00:11:01.052 "compare": false, 00:11:01.052 "compare_and_write": false, 00:11:01.052 "abort": true, 00:11:01.052 "seek_hole": false, 00:11:01.052 "seek_data": false, 00:11:01.052 "copy": true, 00:11:01.052 "nvme_iov_md": false 00:11:01.052 }, 00:11:01.052 "memory_domains": [ 00:11:01.052 { 00:11:01.052 "dma_device_id": "system", 00:11:01.052 "dma_device_type": 1 00:11:01.052 }, 00:11:01.052 { 00:11:01.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.052 "dma_device_type": 2 00:11:01.052 } 00:11:01.052 ], 00:11:01.052 "driver_specific": {} 00:11:01.052 } 00:11:01.052 ] 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.052 18:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.311 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.311 "name": "Existed_Raid", 00:11:01.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.311 "strip_size_kb": 64, 00:11:01.311 "state": "configuring", 00:11:01.311 "raid_level": "raid0", 00:11:01.311 "superblock": false, 00:11:01.311 "num_base_bdevs": 3, 00:11:01.311 "num_base_bdevs_discovered": 1, 00:11:01.311 "num_base_bdevs_operational": 3, 00:11:01.311 "base_bdevs_list": [ 00:11:01.311 { 00:11:01.311 "name": "BaseBdev1", 00:11:01.311 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:01.311 "is_configured": true, 00:11:01.311 "data_offset": 0, 00:11:01.311 "data_size": 65536 00:11:01.311 }, 00:11:01.311 { 00:11:01.311 "name": "BaseBdev2", 00:11:01.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.311 "is_configured": false, 00:11:01.311 "data_offset": 0, 00:11:01.311 "data_size": 0 00:11:01.311 }, 00:11:01.311 { 00:11:01.311 "name": "BaseBdev3", 00:11:01.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.311 "is_configured": false, 00:11:01.311 "data_offset": 0, 00:11:01.311 "data_size": 0 00:11:01.311 } 00:11:01.311 ] 00:11:01.311 }' 00:11:01.311 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.311 18:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.569 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:01.828 [2024-07-24 18:47:46.645788] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:01.828 [2024-07-24 18:47:46.645819] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b7470 name Existed_Raid, state configuring 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:01.828 [2024-07-24 18:47:46.802230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:01.828 [2024-07-24 18:47:46.803289] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:01.828 [2024-07-24 18:47:46.803317] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:01.828 [2024-07-24 18:47:46.803322] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev3 00:11:01.828 [2024-07-24 18:47:46.803327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.828 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:02.087 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:02.087 "name": "Existed_Raid", 00:11:02.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.087 "strip_size_kb": 64, 00:11:02.087 "state": "configuring", 00:11:02.087 "raid_level": "raid0", 00:11:02.087 "superblock": false, 00:11:02.087 "num_base_bdevs": 3, 00:11:02.087 "num_base_bdevs_discovered": 1, 00:11:02.087 "num_base_bdevs_operational": 3, 00:11:02.087 "base_bdevs_list": [ 00:11:02.087 { 00:11:02.087 "name": "BaseBdev1", 00:11:02.087 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:02.087 "is_configured": true, 00:11:02.087 "data_offset": 0, 00:11:02.087 "data_size": 65536 00:11:02.087 }, 00:11:02.087 { 00:11:02.087 "name": "BaseBdev2", 00:11:02.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.087 "is_configured": false, 00:11:02.087 "data_offset": 0, 00:11:02.087 "data_size": 0 00:11:02.087 }, 00:11:02.087 { 00:11:02.087 "name": "BaseBdev3", 00:11:02.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.087 "is_configured": false, 00:11:02.087 "data_offset": 0, 00:11:02.087 "data_size": 0 00:11:02.087 } 00:11:02.087 ] 00:11:02.087 }' 00:11:02.087 18:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:02.087 18:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:02.690 [2024-07-24 18:47:47.643023] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:02.690 BaseBdev2 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:02.690 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:02.948 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:03.208 [ 00:11:03.208 { 00:11:03.208 "name": "BaseBdev2", 00:11:03.208 "aliases": [ 00:11:03.208 "48b29ea9-fa32-4741-a3ef-7bb9c559dc58" 00:11:03.208 ], 00:11:03.208 "product_name": "Malloc disk", 00:11:03.208 "block_size": 512, 00:11:03.208 "num_blocks": 65536, 00:11:03.208 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:03.208 "assigned_rate_limits": { 00:11:03.208 "rw_ios_per_sec": 0, 00:11:03.208 "rw_mbytes_per_sec": 0, 00:11:03.208 "r_mbytes_per_sec": 0, 00:11:03.208 "w_mbytes_per_sec": 0 00:11:03.208 }, 00:11:03.208 "claimed": true, 00:11:03.208 "claim_type": "exclusive_write", 00:11:03.208 "zoned": false, 00:11:03.208 "supported_io_types": { 00:11:03.208 "read": true, 00:11:03.208 "write": true, 00:11:03.208 "unmap": true, 00:11:03.208 "flush": true, 00:11:03.208 "reset": true, 00:11:03.208 "nvme_admin": false, 00:11:03.208 "nvme_io": false, 00:11:03.208 "nvme_io_md": false, 00:11:03.208 "write_zeroes": true, 00:11:03.208 "zcopy": true, 00:11:03.208 "get_zone_info": false, 00:11:03.208 "zone_management": false, 00:11:03.208 "zone_append": false, 00:11:03.208 "compare": false, 00:11:03.208 "compare_and_write": false, 00:11:03.208 "abort": true, 00:11:03.208 "seek_hole": false, 00:11:03.208 "seek_data": false, 00:11:03.208 "copy": true, 00:11:03.208 "nvme_iov_md": false 00:11:03.208 }, 00:11:03.208 "memory_domains": [ 00:11:03.208 { 00:11:03.208 "dma_device_id": "system", 00:11:03.208 "dma_device_type": 1 00:11:03.208 }, 00:11:03.208 { 00:11:03.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:03.208 "dma_device_type": 2 00:11:03.208 } 00:11:03.208 ], 00:11:03.208 "driver_specific": {} 00:11:03.208 } 00:11:03.208 ] 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.208 18:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.208 18:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.208 "name": "Existed_Raid", 00:11:03.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.208 "strip_size_kb": 64, 00:11:03.208 "state": "configuring", 00:11:03.208 "raid_level": "raid0", 00:11:03.208 "superblock": false, 00:11:03.208 "num_base_bdevs": 3, 00:11:03.208 "num_base_bdevs_discovered": 2, 00:11:03.208 "num_base_bdevs_operational": 3, 00:11:03.208 "base_bdevs_list": [ 00:11:03.208 { 00:11:03.208 "name": "BaseBdev1", 00:11:03.208 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:03.208 "is_configured": true, 00:11:03.208 "data_offset": 0, 00:11:03.208 "data_size": 65536 00:11:03.208 }, 00:11:03.208 { 00:11:03.208 "name": "BaseBdev2", 00:11:03.208 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:03.208 "is_configured": true, 00:11:03.208 "data_offset": 0, 00:11:03.208 "data_size": 65536 00:11:03.208 }, 00:11:03.208 { 00:11:03.208 "name": "BaseBdev3", 00:11:03.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.208 "is_configured": false, 00:11:03.208 "data_offset": 0, 00:11:03.208 "data_size": 0 00:11:03.208 } 00:11:03.208 ] 00:11:03.208 }' 00:11:03.208 18:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.208 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.776 18:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:04.034 [2024-07-24 18:47:48.812722] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:04.034 [2024-07-24 18:47:48.812751] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15b8360 00:11:04.034 [2024-07-24 18:47:48.812755] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:04.034 [2024-07-24 18:47:48.812886] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17605f0 00:11:04.034 [2024-07-24 18:47:48.812976] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15b8360 00:11:04.034 [2024-07-24 18:47:48.812982] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15b8360 00:11:04.034 [2024-07-24 
18:47:48.813109] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.034 BaseBdev3 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.034 18:47:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:04.293 [ 00:11:04.293 { 00:11:04.293 "name": "BaseBdev3", 00:11:04.293 "aliases": [ 00:11:04.293 "7e3ad79e-445b-4967-b6ee-d9519edf2be4" 00:11:04.293 ], 00:11:04.293 "product_name": "Malloc disk", 00:11:04.293 "block_size": 512, 00:11:04.293 "num_blocks": 65536, 00:11:04.293 "uuid": "7e3ad79e-445b-4967-b6ee-d9519edf2be4", 00:11:04.293 "assigned_rate_limits": { 00:11:04.293 "rw_ios_per_sec": 0, 00:11:04.293 "rw_mbytes_per_sec": 0, 00:11:04.293 "r_mbytes_per_sec": 0, 00:11:04.293 "w_mbytes_per_sec": 0 00:11:04.294 }, 00:11:04.294 "claimed": true, 00:11:04.294 "claim_type": "exclusive_write", 00:11:04.294 "zoned": false, 00:11:04.294 "supported_io_types": { 00:11:04.294 "read": true, 00:11:04.294 "write": true, 00:11:04.294 "unmap": true, 00:11:04.294 "flush": true, 00:11:04.294 "reset": true, 00:11:04.294 "nvme_admin": false, 00:11:04.294 "nvme_io": false, 00:11:04.294 "nvme_io_md": false, 00:11:04.294 "write_zeroes": true, 00:11:04.294 "zcopy": true, 00:11:04.294 "get_zone_info": false, 00:11:04.294 "zone_management": false, 00:11:04.294 "zone_append": false, 00:11:04.294 "compare": false, 00:11:04.294 "compare_and_write": false, 00:11:04.294 "abort": true, 00:11:04.294 "seek_hole": false, 00:11:04.294 "seek_data": false, 00:11:04.294 "copy": true, 00:11:04.294 "nvme_iov_md": false 00:11:04.294 }, 00:11:04.294 "memory_domains": [ 00:11:04.294 { 00:11:04.294 "dma_device_id": "system", 00:11:04.294 "dma_device_type": 1 00:11:04.294 }, 00:11:04.294 { 00:11:04.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.294 "dma_device_type": 2 00:11:04.294 } 00:11:04.294 ], 00:11:04.294 "driver_specific": {} 00:11:04.294 } 00:11:04.294 ] 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.294 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.553 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.553 "name": "Existed_Raid", 00:11:04.553 "uuid": "1d4c3250-84b2-4689-ac7f-29d4e44a213f", 00:11:04.553 "strip_size_kb": 64, 00:11:04.553 "state": "online", 00:11:04.553 "raid_level": "raid0", 00:11:04.553 "superblock": false, 00:11:04.553 "num_base_bdevs": 3, 00:11:04.553 "num_base_bdevs_discovered": 3, 00:11:04.553 "num_base_bdevs_operational": 3, 00:11:04.553 "base_bdevs_list": [ 00:11:04.553 { 00:11:04.553 "name": "BaseBdev1", 00:11:04.553 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:04.553 "is_configured": true, 00:11:04.553 "data_offset": 0, 00:11:04.553 "data_size": 65536 00:11:04.553 }, 00:11:04.553 { 00:11:04.553 "name": "BaseBdev2", 00:11:04.553 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:04.553 "is_configured": true, 00:11:04.553 "data_offset": 0, 00:11:04.553 "data_size": 65536 00:11:04.553 }, 00:11:04.553 { 00:11:04.553 "name": "BaseBdev3", 00:11:04.553 "uuid": "7e3ad79e-445b-4967-b6ee-d9519edf2be4", 00:11:04.553 "is_configured": true, 00:11:04.553 "data_offset": 0, 00:11:04.553 "data_size": 65536 00:11:04.553 } 00:11:04.553 ] 00:11:04.553 }' 00:11:04.553 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.553 18:47:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:04.811 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:04.812 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:05.070 [2024-07-24 18:47:49.971926] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.070 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:05.070 "name": "Existed_Raid", 00:11:05.070 "aliases": [ 00:11:05.070 "1d4c3250-84b2-4689-ac7f-29d4e44a213f" 00:11:05.070 ], 00:11:05.070 "product_name": "Raid Volume", 00:11:05.070 "block_size": 512, 00:11:05.070 "num_blocks": 196608, 00:11:05.070 "uuid": "1d4c3250-84b2-4689-ac7f-29d4e44a213f", 00:11:05.070 "assigned_rate_limits": { 00:11:05.070 "rw_ios_per_sec": 0, 00:11:05.070 "rw_mbytes_per_sec": 0, 00:11:05.070 "r_mbytes_per_sec": 0, 00:11:05.070 "w_mbytes_per_sec": 0 00:11:05.070 }, 00:11:05.070 "claimed": false, 00:11:05.070 "zoned": false, 00:11:05.070 "supported_io_types": { 00:11:05.070 "read": true, 00:11:05.070 "write": true, 00:11:05.070 "unmap": true, 00:11:05.070 "flush": true, 00:11:05.070 "reset": true, 00:11:05.070 "nvme_admin": false, 00:11:05.070 "nvme_io": false, 00:11:05.070 "nvme_io_md": false, 00:11:05.070 "write_zeroes": true, 00:11:05.070 "zcopy": false, 00:11:05.070 "get_zone_info": false, 00:11:05.070 "zone_management": false, 00:11:05.070 "zone_append": false, 00:11:05.070 "compare": false, 00:11:05.070 "compare_and_write": false, 00:11:05.070 "abort": false, 00:11:05.070 "seek_hole": false, 00:11:05.070 "seek_data": false, 00:11:05.070 "copy": false, 00:11:05.070 "nvme_iov_md": false 00:11:05.070 }, 00:11:05.070 "memory_domains": [ 00:11:05.070 { 00:11:05.070 "dma_device_id": "system", 00:11:05.070 "dma_device_type": 1 00:11:05.070 }, 00:11:05.070 { 00:11:05.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.070 "dma_device_type": 2 00:11:05.070 }, 00:11:05.070 { 00:11:05.070 "dma_device_id": "system", 00:11:05.070 "dma_device_type": 1 00:11:05.070 }, 00:11:05.071 { 00:11:05.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.071 "dma_device_type": 2 00:11:05.071 }, 00:11:05.071 { 00:11:05.071 "dma_device_id": "system", 00:11:05.071 "dma_device_type": 1 00:11:05.071 }, 00:11:05.071 { 00:11:05.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.071 "dma_device_type": 2 00:11:05.071 } 00:11:05.071 ], 00:11:05.071 "driver_specific": { 00:11:05.071 "raid": { 00:11:05.071 "uuid": "1d4c3250-84b2-4689-ac7f-29d4e44a213f", 00:11:05.071 "strip_size_kb": 64, 00:11:05.071 "state": "online", 00:11:05.071 "raid_level": "raid0", 00:11:05.071 "superblock": false, 00:11:05.071 "num_base_bdevs": 3, 00:11:05.071 "num_base_bdevs_discovered": 3, 00:11:05.071 "num_base_bdevs_operational": 3, 00:11:05.071 "base_bdevs_list": [ 00:11:05.071 { 00:11:05.071 "name": "BaseBdev1", 00:11:05.071 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:05.071 "is_configured": true, 00:11:05.071 "data_offset": 0, 00:11:05.071 "data_size": 65536 00:11:05.071 }, 00:11:05.071 { 00:11:05.071 "name": "BaseBdev2", 00:11:05.071 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:05.071 "is_configured": true, 00:11:05.071 "data_offset": 0, 00:11:05.071 "data_size": 65536 00:11:05.071 }, 00:11:05.071 { 00:11:05.071 "name": "BaseBdev3", 00:11:05.071 "uuid": "7e3ad79e-445b-4967-b6ee-d9519edf2be4", 00:11:05.071 "is_configured": true, 00:11:05.071 "data_offset": 0, 00:11:05.071 "data_size": 65536 00:11:05.071 } 00:11:05.071 ] 00:11:05.071 } 00:11:05.071 } 00:11:05.071 }' 00:11:05.071 18:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.071 18:47:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:05.071 BaseBdev2 00:11:05.071 BaseBdev3' 00:11:05.071 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.071 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:05.071 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:05.329 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:05.329 "name": "BaseBdev1", 00:11:05.329 "aliases": [ 00:11:05.329 "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27" 00:11:05.329 ], 00:11:05.329 "product_name": "Malloc disk", 00:11:05.329 "block_size": 512, 00:11:05.329 "num_blocks": 65536, 00:11:05.329 "uuid": "dbf0aa08-f3bb-4e83-81dc-1c5f47e96c27", 00:11:05.329 "assigned_rate_limits": { 00:11:05.329 "rw_ios_per_sec": 0, 00:11:05.329 "rw_mbytes_per_sec": 0, 00:11:05.329 "r_mbytes_per_sec": 0, 00:11:05.329 "w_mbytes_per_sec": 0 00:11:05.329 }, 00:11:05.329 "claimed": true, 00:11:05.329 "claim_type": "exclusive_write", 00:11:05.329 "zoned": false, 00:11:05.329 "supported_io_types": { 00:11:05.329 "read": true, 00:11:05.329 "write": true, 00:11:05.329 "unmap": true, 00:11:05.329 "flush": true, 00:11:05.329 "reset": true, 00:11:05.329 "nvme_admin": false, 00:11:05.329 "nvme_io": false, 00:11:05.329 "nvme_io_md": false, 00:11:05.329 "write_zeroes": true, 00:11:05.329 "zcopy": true, 00:11:05.329 "get_zone_info": false, 00:11:05.329 "zone_management": false, 00:11:05.329 "zone_append": false, 00:11:05.329 "compare": false, 00:11:05.329 "compare_and_write": false, 00:11:05.329 "abort": true, 00:11:05.329 "seek_hole": false, 00:11:05.329 "seek_data": false, 00:11:05.329 "copy": true, 00:11:05.329 "nvme_iov_md": false 00:11:05.329 }, 00:11:05.329 "memory_domains": [ 00:11:05.329 { 00:11:05.329 "dma_device_id": "system", 00:11:05.329 "dma_device_type": 1 00:11:05.329 }, 00:11:05.329 { 00:11:05.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.330 "dma_device_type": 2 00:11:05.330 } 00:11:05.330 ], 00:11:05.330 "driver_specific": {} 00:11:05.330 }' 00:11:05.330 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.330 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.330 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:05.330 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:05.330 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:05.589 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:05.848 "name": "BaseBdev2", 00:11:05.848 "aliases": [ 00:11:05.848 "48b29ea9-fa32-4741-a3ef-7bb9c559dc58" 00:11:05.848 ], 00:11:05.848 "product_name": "Malloc disk", 00:11:05.848 "block_size": 512, 00:11:05.848 "num_blocks": 65536, 00:11:05.848 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:05.848 "assigned_rate_limits": { 00:11:05.848 "rw_ios_per_sec": 0, 00:11:05.848 "rw_mbytes_per_sec": 0, 00:11:05.848 "r_mbytes_per_sec": 0, 00:11:05.848 "w_mbytes_per_sec": 0 00:11:05.848 }, 00:11:05.848 "claimed": true, 00:11:05.848 "claim_type": "exclusive_write", 00:11:05.848 "zoned": false, 00:11:05.848 "supported_io_types": { 00:11:05.848 "read": true, 00:11:05.848 "write": true, 00:11:05.848 "unmap": true, 00:11:05.848 "flush": true, 00:11:05.848 "reset": true, 00:11:05.848 "nvme_admin": false, 00:11:05.848 "nvme_io": false, 00:11:05.848 "nvme_io_md": false, 00:11:05.848 "write_zeroes": true, 00:11:05.848 "zcopy": true, 00:11:05.848 "get_zone_info": false, 00:11:05.848 "zone_management": false, 00:11:05.848 "zone_append": false, 00:11:05.848 "compare": false, 00:11:05.848 "compare_and_write": false, 00:11:05.848 "abort": true, 00:11:05.848 "seek_hole": false, 00:11:05.848 "seek_data": false, 00:11:05.848 "copy": true, 00:11:05.848 "nvme_iov_md": false 00:11:05.848 }, 00:11:05.848 "memory_domains": [ 00:11:05.848 { 00:11:05.848 "dma_device_id": "system", 00:11:05.848 "dma_device_type": 1 00:11:05.848 }, 00:11:05.848 { 00:11:05.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.848 "dma_device_type": 2 00:11:05.848 } 00:11:05.848 ], 00:11:05.848 "driver_specific": {} 00:11:05.848 }' 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:05.848 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.106 18:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:06.106 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.106 18:47:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:06.364 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.364 "name": "BaseBdev3", 00:11:06.364 "aliases": [ 00:11:06.364 "7e3ad79e-445b-4967-b6ee-d9519edf2be4" 00:11:06.364 ], 00:11:06.364 "product_name": "Malloc disk", 00:11:06.364 "block_size": 512, 00:11:06.364 "num_blocks": 65536, 00:11:06.364 "uuid": "7e3ad79e-445b-4967-b6ee-d9519edf2be4", 00:11:06.364 "assigned_rate_limits": { 00:11:06.364 "rw_ios_per_sec": 0, 00:11:06.364 "rw_mbytes_per_sec": 0, 00:11:06.364 "r_mbytes_per_sec": 0, 00:11:06.364 "w_mbytes_per_sec": 0 00:11:06.364 }, 00:11:06.364 "claimed": true, 00:11:06.364 "claim_type": "exclusive_write", 00:11:06.364 "zoned": false, 00:11:06.364 "supported_io_types": { 00:11:06.364 "read": true, 00:11:06.364 "write": true, 00:11:06.364 "unmap": true, 00:11:06.364 "flush": true, 00:11:06.364 "reset": true, 00:11:06.364 "nvme_admin": false, 00:11:06.364 "nvme_io": false, 00:11:06.364 "nvme_io_md": false, 00:11:06.364 "write_zeroes": true, 00:11:06.364 "zcopy": true, 00:11:06.364 "get_zone_info": false, 00:11:06.364 "zone_management": false, 00:11:06.364 "zone_append": false, 00:11:06.364 "compare": false, 00:11:06.364 "compare_and_write": false, 00:11:06.364 "abort": true, 00:11:06.364 "seek_hole": false, 00:11:06.364 "seek_data": false, 00:11:06.364 "copy": true, 00:11:06.364 "nvme_iov_md": false 00:11:06.364 }, 00:11:06.364 "memory_domains": [ 00:11:06.364 { 00:11:06.364 "dma_device_id": "system", 00:11:06.365 "dma_device_type": 1 00:11:06.365 }, 00:11:06.365 { 00:11:06.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.365 "dma_device_type": 2 00:11:06.365 } 00:11:06.365 ], 00:11:06.365 "driver_specific": {} 00:11:06.365 }' 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.365 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.623 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:06.882 [2024-07-24 18:47:51.656187] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:06.882 [2024-07-24 18:47:51.656206] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:11:06.883 [2024-07-24 18:47:51.656235] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.883 "name": "Existed_Raid", 00:11:06.883 "uuid": "1d4c3250-84b2-4689-ac7f-29d4e44a213f", 00:11:06.883 "strip_size_kb": 64, 00:11:06.883 "state": "offline", 00:11:06.883 "raid_level": "raid0", 00:11:06.883 "superblock": false, 00:11:06.883 "num_base_bdevs": 3, 00:11:06.883 "num_base_bdevs_discovered": 2, 00:11:06.883 "num_base_bdevs_operational": 2, 00:11:06.883 "base_bdevs_list": [ 00:11:06.883 { 00:11:06.883 "name": null, 00:11:06.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.883 "is_configured": false, 00:11:06.883 "data_offset": 0, 00:11:06.883 "data_size": 65536 00:11:06.883 }, 00:11:06.883 { 00:11:06.883 "name": "BaseBdev2", 00:11:06.883 "uuid": "48b29ea9-fa32-4741-a3ef-7bb9c559dc58", 00:11:06.883 "is_configured": true, 00:11:06.883 "data_offset": 0, 00:11:06.883 "data_size": 65536 00:11:06.883 }, 00:11:06.883 { 00:11:06.883 "name": "BaseBdev3", 00:11:06.883 "uuid": "7e3ad79e-445b-4967-b6ee-d9519edf2be4", 00:11:06.883 "is_configured": true, 00:11:06.883 "data_offset": 0, 00:11:06.883 "data_size": 65536 00:11:06.883 } 00:11:06.883 ] 00:11:06.883 }' 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.883 18:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
00:11:07.450 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:07.450 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:07.450 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.450 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:07.708 [2024-07-24 18:47:52.687610] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.708 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:07.967 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:07.967 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:07.967 18:47:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:08.225 [2024-07-24 18:47:53.050384] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:08.225 [2024-07-24 18:47:53.050415] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b8360 name Existed_Raid, state offline 00:11:08.225 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:08.225 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:08.225 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.225 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:11:08.489 BaseBdev2 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:08.489 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:08.748 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:08.748 [ 00:11:08.748 { 00:11:08.748 "name": "BaseBdev2", 00:11:08.748 "aliases": [ 00:11:08.748 "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e" 00:11:08.748 ], 00:11:08.748 "product_name": "Malloc disk", 00:11:08.748 "block_size": 512, 00:11:08.748 "num_blocks": 65536, 00:11:08.748 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:08.748 "assigned_rate_limits": { 00:11:08.748 "rw_ios_per_sec": 0, 00:11:08.748 "rw_mbytes_per_sec": 0, 00:11:08.748 "r_mbytes_per_sec": 0, 00:11:08.748 "w_mbytes_per_sec": 0 00:11:08.748 }, 00:11:08.748 "claimed": false, 00:11:08.748 "zoned": false, 00:11:08.748 "supported_io_types": { 00:11:08.748 "read": true, 00:11:08.748 "write": true, 00:11:08.748 "unmap": true, 00:11:08.748 "flush": true, 00:11:08.748 "reset": true, 00:11:08.748 "nvme_admin": false, 00:11:08.748 "nvme_io": false, 00:11:08.748 "nvme_io_md": false, 00:11:08.748 "write_zeroes": true, 00:11:08.748 "zcopy": true, 00:11:08.748 "get_zone_info": false, 00:11:08.748 "zone_management": false, 00:11:08.748 "zone_append": false, 00:11:08.748 "compare": false, 00:11:08.748 "compare_and_write": false, 00:11:08.748 "abort": true, 00:11:08.748 "seek_hole": false, 00:11:08.748 "seek_data": false, 00:11:08.748 "copy": true, 00:11:08.748 "nvme_iov_md": false 00:11:08.748 }, 00:11:08.748 "memory_domains": [ 00:11:08.748 { 00:11:08.748 "dma_device_id": "system", 00:11:08.748 "dma_device_type": 1 00:11:08.748 }, 00:11:08.748 { 00:11:08.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.748 "dma_device_type": 2 00:11:08.748 } 00:11:08.748 ], 00:11:08.748 "driver_specific": {} 00:11:08.748 } 00:11:08.748 ] 00:11:08.748 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:08.748 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:08.748 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:08.748 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:09.006 BaseBdev3 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:09.006 18:47:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:09.006 18:47:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:09.265 18:47:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:09.265 [ 00:11:09.265 { 00:11:09.265 "name": "BaseBdev3", 00:11:09.265 "aliases": [ 00:11:09.265 "4a5ae503-d04c-4909-ba6e-efb7b32af445" 00:11:09.265 ], 00:11:09.265 "product_name": "Malloc disk", 00:11:09.265 "block_size": 512, 00:11:09.265 "num_blocks": 65536, 00:11:09.265 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:09.265 "assigned_rate_limits": { 00:11:09.265 "rw_ios_per_sec": 0, 00:11:09.265 "rw_mbytes_per_sec": 0, 00:11:09.265 "r_mbytes_per_sec": 0, 00:11:09.265 "w_mbytes_per_sec": 0 00:11:09.265 }, 00:11:09.265 "claimed": false, 00:11:09.265 "zoned": false, 00:11:09.265 "supported_io_types": { 00:11:09.265 "read": true, 00:11:09.265 "write": true, 00:11:09.265 "unmap": true, 00:11:09.265 "flush": true, 00:11:09.265 "reset": true, 00:11:09.265 "nvme_admin": false, 00:11:09.265 "nvme_io": false, 00:11:09.265 "nvme_io_md": false, 00:11:09.265 "write_zeroes": true, 00:11:09.265 "zcopy": true, 00:11:09.265 "get_zone_info": false, 00:11:09.265 "zone_management": false, 00:11:09.265 "zone_append": false, 00:11:09.265 "compare": false, 00:11:09.265 "compare_and_write": false, 00:11:09.265 "abort": true, 00:11:09.265 "seek_hole": false, 00:11:09.265 "seek_data": false, 00:11:09.265 "copy": true, 00:11:09.265 "nvme_iov_md": false 00:11:09.265 }, 00:11:09.265 "memory_domains": [ 00:11:09.265 { 00:11:09.265 "dma_device_id": "system", 00:11:09.265 "dma_device_type": 1 00:11:09.265 }, 00:11:09.265 { 00:11:09.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.265 "dma_device_type": 2 00:11:09.265 } 00:11:09.265 ], 00:11:09.265 "driver_specific": {} 00:11:09.265 } 00:11:09.265 ] 00:11:09.265 18:47:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:09.265 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:09.265 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:09.265 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:09.524 [2024-07-24 18:47:54.387048] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:09.524 [2024-07-24 18:47:54.387080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:09.524 [2024-07-24 18:47:54.387091] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:09.524 [2024-07-24 18:47:54.388054] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
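The sequence that ends above rebuilds the member devices and re-declares the array: each malloc bdev is created over RPC as 32 MB with a 512-byte block size (65536 blocks, matching the descriptors printed in the trace), waitforbdev then runs bdev_wait_for_examine and polls bdev_get_bdevs with a 2000 ms timeout until the device is visible, and bdev_raid_create is issued while BaseBdev1 still does not exist, which is why the array is left "configuring" with only two base bdevs claimed. A condensed sketch of those RPC calls, using only the sizes, names and flags shown in the trace (the loop and the absence of error handling are simplifying assumptions):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Recreate the two surviving members: 32 MB each, 512-byte blocks.
for name in BaseBdev2 BaseBdev3; do
    $RPC bdev_malloc_create 32 512 -b "$name"
    $RPC bdev_wait_for_examine
    $RPC bdev_get_bdevs -b "$name" -t 2000 > /dev/null   # waitforbdev-style readiness poll
done

# Declare the raid0 array with a 64 KB strip over three members; BaseBdev1 does not
# exist yet, so the raid bdev stays in the "configuring" state until it appears.
$RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid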
00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.524 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.783 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.783 "name": "Existed_Raid", 00:11:09.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.783 "strip_size_kb": 64, 00:11:09.783 "state": "configuring", 00:11:09.783 "raid_level": "raid0", 00:11:09.783 "superblock": false, 00:11:09.783 "num_base_bdevs": 3, 00:11:09.783 "num_base_bdevs_discovered": 2, 00:11:09.783 "num_base_bdevs_operational": 3, 00:11:09.783 "base_bdevs_list": [ 00:11:09.783 { 00:11:09.783 "name": "BaseBdev1", 00:11:09.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.783 "is_configured": false, 00:11:09.783 "data_offset": 0, 00:11:09.783 "data_size": 0 00:11:09.783 }, 00:11:09.783 { 00:11:09.783 "name": "BaseBdev2", 00:11:09.783 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:09.783 "is_configured": true, 00:11:09.783 "data_offset": 0, 00:11:09.783 "data_size": 65536 00:11:09.783 }, 00:11:09.783 { 00:11:09.783 "name": "BaseBdev3", 00:11:09.783 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:09.783 "is_configured": true, 00:11:09.783 "data_offset": 0, 00:11:09.783 "data_size": 65536 00:11:09.783 } 00:11:09.783 ] 00:11:09.783 }' 00:11:09.783 18:47:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.783 18:47:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.042 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:10.300 [2024-07-24 18:47:55.205150] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:10.300 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.301 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:10.560 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.560 "name": "Existed_Raid", 00:11:10.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.560 "strip_size_kb": 64, 00:11:10.560 "state": "configuring", 00:11:10.560 "raid_level": "raid0", 00:11:10.560 "superblock": false, 00:11:10.560 "num_base_bdevs": 3, 00:11:10.560 "num_base_bdevs_discovered": 1, 00:11:10.560 "num_base_bdevs_operational": 3, 00:11:10.560 "base_bdevs_list": [ 00:11:10.560 { 00:11:10.560 "name": "BaseBdev1", 00:11:10.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.560 "is_configured": false, 00:11:10.560 "data_offset": 0, 00:11:10.560 "data_size": 0 00:11:10.560 }, 00:11:10.560 { 00:11:10.560 "name": null, 00:11:10.560 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:10.560 "is_configured": false, 00:11:10.560 "data_offset": 0, 00:11:10.560 "data_size": 65536 00:11:10.560 }, 00:11:10.560 { 00:11:10.560 "name": "BaseBdev3", 00:11:10.560 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:10.560 "is_configured": true, 00:11:10.560 "data_offset": 0, 00:11:10.560 "data_size": 65536 00:11:10.560 } 00:11:10.560 ] 00:11:10.560 }' 00:11:10.560 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.560 18:47:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.127 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.127 18:47:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:11.127 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:11.127 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:11.387 [2024-07-24 18:47:56.222371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:11.387 BaseBdev1 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:11.387 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:11.646 [ 00:11:11.646 { 00:11:11.646 "name": "BaseBdev1", 00:11:11.646 "aliases": [ 00:11:11.646 "986f41f1-67f6-4578-ab93-e591cb900868" 00:11:11.646 ], 00:11:11.646 "product_name": "Malloc disk", 00:11:11.646 "block_size": 512, 00:11:11.646 "num_blocks": 65536, 00:11:11.646 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:11.646 "assigned_rate_limits": { 00:11:11.646 "rw_ios_per_sec": 0, 00:11:11.646 "rw_mbytes_per_sec": 0, 00:11:11.646 "r_mbytes_per_sec": 0, 00:11:11.646 "w_mbytes_per_sec": 0 00:11:11.646 }, 00:11:11.646 "claimed": true, 00:11:11.646 "claim_type": "exclusive_write", 00:11:11.646 "zoned": false, 00:11:11.646 "supported_io_types": { 00:11:11.646 "read": true, 00:11:11.646 "write": true, 00:11:11.646 "unmap": true, 00:11:11.646 "flush": true, 00:11:11.646 "reset": true, 00:11:11.646 "nvme_admin": false, 00:11:11.646 "nvme_io": false, 00:11:11.646 "nvme_io_md": false, 00:11:11.646 "write_zeroes": true, 00:11:11.646 "zcopy": true, 00:11:11.646 "get_zone_info": false, 00:11:11.646 "zone_management": false, 00:11:11.646 "zone_append": false, 00:11:11.646 "compare": false, 00:11:11.646 "compare_and_write": false, 00:11:11.646 "abort": true, 00:11:11.646 "seek_hole": false, 00:11:11.646 "seek_data": false, 00:11:11.646 "copy": true, 00:11:11.646 "nvme_iov_md": false 00:11:11.646 }, 00:11:11.646 "memory_domains": [ 00:11:11.646 { 00:11:11.646 "dma_device_id": "system", 00:11:11.646 "dma_device_type": 1 00:11:11.646 }, 00:11:11.646 { 00:11:11.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.646 "dma_device_type": 2 00:11:11.646 } 00:11:11.646 ], 00:11:11.646 "driver_specific": {} 00:11:11.646 } 00:11:11.646 ] 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.646 18:47:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.646 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:11.905 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.905 "name": "Existed_Raid", 00:11:11.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.905 "strip_size_kb": 64, 00:11:11.905 "state": "configuring", 00:11:11.905 "raid_level": "raid0", 00:11:11.905 "superblock": false, 00:11:11.905 "num_base_bdevs": 3, 00:11:11.905 "num_base_bdevs_discovered": 2, 00:11:11.905 "num_base_bdevs_operational": 3, 00:11:11.905 "base_bdevs_list": [ 00:11:11.905 { 00:11:11.905 "name": "BaseBdev1", 00:11:11.905 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:11.905 "is_configured": true, 00:11:11.905 "data_offset": 0, 00:11:11.905 "data_size": 65536 00:11:11.905 }, 00:11:11.905 { 00:11:11.905 "name": null, 00:11:11.905 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:11.905 "is_configured": false, 00:11:11.905 "data_offset": 0, 00:11:11.905 "data_size": 65536 00:11:11.905 }, 00:11:11.905 { 00:11:11.905 "name": "BaseBdev3", 00:11:11.905 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:11.905 "is_configured": true, 00:11:11.905 "data_offset": 0, 00:11:11.905 "data_size": 65536 00:11:11.905 } 00:11:11.905 ] 00:11:11.905 }' 00:11:11.905 18:47:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.905 18:47:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.472 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.472 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:12.472 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:12.472 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:12.732 [2024-07-24 18:47:57.517764] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:12.732 18:47:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.732 "name": "Existed_Raid", 00:11:12.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.732 "strip_size_kb": 64, 00:11:12.732 "state": "configuring", 00:11:12.732 "raid_level": "raid0", 00:11:12.732 "superblock": false, 00:11:12.732 "num_base_bdevs": 3, 00:11:12.732 "num_base_bdevs_discovered": 1, 00:11:12.732 "num_base_bdevs_operational": 3, 00:11:12.732 "base_bdevs_list": [ 00:11:12.732 { 00:11:12.732 "name": "BaseBdev1", 00:11:12.732 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:12.732 "is_configured": true, 00:11:12.732 "data_offset": 0, 00:11:12.732 "data_size": 65536 00:11:12.732 }, 00:11:12.732 { 00:11:12.732 "name": null, 00:11:12.732 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:12.732 "is_configured": false, 00:11:12.732 "data_offset": 0, 00:11:12.732 "data_size": 65536 00:11:12.732 }, 00:11:12.732 { 00:11:12.732 "name": null, 00:11:12.732 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:12.732 "is_configured": false, 00:11:12.732 "data_offset": 0, 00:11:12.732 "data_size": 65536 00:11:12.732 } 00:11:12.732 ] 00:11:12.732 }' 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.732 18:47:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.298 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:13.298 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.557 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:13.557 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:13.557 [2024-07-24 18:47:58.540413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:13.557 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.558 
18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.558 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.816 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.816 "name": "Existed_Raid", 00:11:13.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.816 "strip_size_kb": 64, 00:11:13.816 "state": "configuring", 00:11:13.816 "raid_level": "raid0", 00:11:13.816 "superblock": false, 00:11:13.816 "num_base_bdevs": 3, 00:11:13.816 "num_base_bdevs_discovered": 2, 00:11:13.816 "num_base_bdevs_operational": 3, 00:11:13.816 "base_bdevs_list": [ 00:11:13.816 { 00:11:13.816 "name": "BaseBdev1", 00:11:13.816 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:13.816 "is_configured": true, 00:11:13.816 "data_offset": 0, 00:11:13.816 "data_size": 65536 00:11:13.816 }, 00:11:13.816 { 00:11:13.816 "name": null, 00:11:13.816 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:13.816 "is_configured": false, 00:11:13.816 "data_offset": 0, 00:11:13.816 "data_size": 65536 00:11:13.816 }, 00:11:13.816 { 00:11:13.816 "name": "BaseBdev3", 00:11:13.816 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:13.816 "is_configured": true, 00:11:13.816 "data_offset": 0, 00:11:13.816 "data_size": 65536 00:11:13.816 } 00:11:13.816 ] 00:11:13.816 }' 00:11:13.816 18:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.816 18:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.383 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.383 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:14.383 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:14.383 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:14.642 [2024-07-24 18:47:59.543030] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:14.642 18:47:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.642 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:14.901 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:14.901 "name": "Existed_Raid", 00:11:14.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:14.901 "strip_size_kb": 64, 00:11:14.901 "state": "configuring", 00:11:14.901 "raid_level": "raid0", 00:11:14.901 "superblock": false, 00:11:14.901 "num_base_bdevs": 3, 00:11:14.901 "num_base_bdevs_discovered": 1, 00:11:14.901 "num_base_bdevs_operational": 3, 00:11:14.901 "base_bdevs_list": [ 00:11:14.901 { 00:11:14.901 "name": null, 00:11:14.901 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:14.901 "is_configured": false, 00:11:14.901 "data_offset": 0, 00:11:14.901 "data_size": 65536 00:11:14.901 }, 00:11:14.901 { 00:11:14.901 "name": null, 00:11:14.901 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:14.901 "is_configured": false, 00:11:14.901 "data_offset": 0, 00:11:14.901 "data_size": 65536 00:11:14.901 }, 00:11:14.901 { 00:11:14.901 "name": "BaseBdev3", 00:11:14.901 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:14.901 "is_configured": true, 00:11:14.901 "data_offset": 0, 00:11:14.901 "data_size": 65536 00:11:14.901 } 00:11:14.901 ] 00:11:14.901 }' 00:11:14.901 18:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:14.901 18:47:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.467 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.467 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:15.467 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:15.468 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:15.726 [2024-07-24 18:48:00.583664] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.726 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.985 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.985 "name": "Existed_Raid", 00:11:15.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.985 "strip_size_kb": 64, 00:11:15.985 "state": "configuring", 00:11:15.985 "raid_level": "raid0", 00:11:15.985 "superblock": false, 00:11:15.985 "num_base_bdevs": 3, 00:11:15.985 "num_base_bdevs_discovered": 2, 00:11:15.985 "num_base_bdevs_operational": 3, 00:11:15.985 "base_bdevs_list": [ 00:11:15.985 { 00:11:15.985 "name": null, 00:11:15.985 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:15.985 "is_configured": false, 00:11:15.985 "data_offset": 0, 00:11:15.985 "data_size": 65536 00:11:15.985 }, 00:11:15.985 { 00:11:15.985 "name": "BaseBdev2", 00:11:15.985 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:15.985 "is_configured": true, 00:11:15.985 "data_offset": 0, 00:11:15.985 "data_size": 65536 00:11:15.985 }, 00:11:15.985 { 00:11:15.985 "name": "BaseBdev3", 00:11:15.985 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:15.985 "is_configured": true, 00:11:15.985 "data_offset": 0, 00:11:15.985 "data_size": 65536 00:11:15.985 } 00:11:15.985 ] 00:11:15.985 }' 00:11:15.985 18:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.985 18:48:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.244 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.244 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:16.503 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:16.503 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.503 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 986f41f1-67f6-4578-ab93-e591cb900868 00:11:16.761 
[2024-07-24 18:48:01.750779] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:16.761 [2024-07-24 18:48:01.750805] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x175ca90 00:11:16.761 [2024-07-24 18:48:01.750810] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:16.761 [2024-07-24 18:48:01.750956] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a42d0 00:11:16.761 [2024-07-24 18:48:01.751051] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x175ca90 00:11:16.761 [2024-07-24 18:48:01.751057] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x175ca90 00:11:16.761 [2024-07-24 18:48:01.751184] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:16.761 NewBaseBdev 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.761 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:17.019 18:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:17.278 [ 00:11:17.278 { 00:11:17.278 "name": "NewBaseBdev", 00:11:17.278 "aliases": [ 00:11:17.278 "986f41f1-67f6-4578-ab93-e591cb900868" 00:11:17.278 ], 00:11:17.278 "product_name": "Malloc disk", 00:11:17.278 "block_size": 512, 00:11:17.278 "num_blocks": 65536, 00:11:17.278 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:17.278 "assigned_rate_limits": { 00:11:17.278 "rw_ios_per_sec": 0, 00:11:17.278 "rw_mbytes_per_sec": 0, 00:11:17.278 "r_mbytes_per_sec": 0, 00:11:17.278 "w_mbytes_per_sec": 0 00:11:17.278 }, 00:11:17.278 "claimed": true, 00:11:17.278 "claim_type": "exclusive_write", 00:11:17.278 "zoned": false, 00:11:17.278 "supported_io_types": { 00:11:17.278 "read": true, 00:11:17.278 "write": true, 00:11:17.278 "unmap": true, 00:11:17.278 "flush": true, 00:11:17.278 "reset": true, 00:11:17.278 "nvme_admin": false, 00:11:17.278 "nvme_io": false, 00:11:17.278 "nvme_io_md": false, 00:11:17.278 "write_zeroes": true, 00:11:17.278 "zcopy": true, 00:11:17.278 "get_zone_info": false, 00:11:17.278 "zone_management": false, 00:11:17.278 "zone_append": false, 00:11:17.278 "compare": false, 00:11:17.278 "compare_and_write": false, 00:11:17.278 "abort": true, 00:11:17.278 "seek_hole": false, 00:11:17.278 "seek_data": false, 00:11:17.278 "copy": true, 00:11:17.278 "nvme_iov_md": false 00:11:17.278 }, 00:11:17.278 "memory_domains": [ 00:11:17.278 { 00:11:17.278 "dma_device_id": "system", 00:11:17.278 "dma_device_type": 1 00:11:17.278 }, 00:11:17.278 { 00:11:17.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.278 
"dma_device_type": 2 00:11:17.278 } 00:11:17.278 ], 00:11:17.278 "driver_specific": {} 00:11:17.278 } 00:11:17.278 ] 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.278 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.541 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.541 "name": "Existed_Raid", 00:11:17.541 "uuid": "4061dd7b-a8c8-4e15-ab74-495e7637ece8", 00:11:17.541 "strip_size_kb": 64, 00:11:17.541 "state": "online", 00:11:17.541 "raid_level": "raid0", 00:11:17.541 "superblock": false, 00:11:17.541 "num_base_bdevs": 3, 00:11:17.541 "num_base_bdevs_discovered": 3, 00:11:17.541 "num_base_bdevs_operational": 3, 00:11:17.541 "base_bdevs_list": [ 00:11:17.541 { 00:11:17.541 "name": "NewBaseBdev", 00:11:17.541 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:17.541 "is_configured": true, 00:11:17.541 "data_offset": 0, 00:11:17.541 "data_size": 65536 00:11:17.541 }, 00:11:17.541 { 00:11:17.541 "name": "BaseBdev2", 00:11:17.541 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:17.541 "is_configured": true, 00:11:17.541 "data_offset": 0, 00:11:17.541 "data_size": 65536 00:11:17.541 }, 00:11:17.541 { 00:11:17.541 "name": "BaseBdev3", 00:11:17.541 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:17.541 "is_configured": true, 00:11:17.541 "data_offset": 0, 00:11:17.541 "data_size": 65536 00:11:17.541 } 00:11:17.541 ] 00:11:17.541 }' 00:11:17.541 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.541 18:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:17.801 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:18.060 [2024-07-24 18:48:02.885900] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:18.060 "name": "Existed_Raid", 00:11:18.060 "aliases": [ 00:11:18.060 "4061dd7b-a8c8-4e15-ab74-495e7637ece8" 00:11:18.060 ], 00:11:18.060 "product_name": "Raid Volume", 00:11:18.060 "block_size": 512, 00:11:18.060 "num_blocks": 196608, 00:11:18.060 "uuid": "4061dd7b-a8c8-4e15-ab74-495e7637ece8", 00:11:18.060 "assigned_rate_limits": { 00:11:18.060 "rw_ios_per_sec": 0, 00:11:18.060 "rw_mbytes_per_sec": 0, 00:11:18.060 "r_mbytes_per_sec": 0, 00:11:18.060 "w_mbytes_per_sec": 0 00:11:18.060 }, 00:11:18.060 "claimed": false, 00:11:18.060 "zoned": false, 00:11:18.060 "supported_io_types": { 00:11:18.060 "read": true, 00:11:18.060 "write": true, 00:11:18.060 "unmap": true, 00:11:18.060 "flush": true, 00:11:18.060 "reset": true, 00:11:18.060 "nvme_admin": false, 00:11:18.060 "nvme_io": false, 00:11:18.060 "nvme_io_md": false, 00:11:18.060 "write_zeroes": true, 00:11:18.060 "zcopy": false, 00:11:18.060 "get_zone_info": false, 00:11:18.060 "zone_management": false, 00:11:18.060 "zone_append": false, 00:11:18.060 "compare": false, 00:11:18.060 "compare_and_write": false, 00:11:18.060 "abort": false, 00:11:18.060 "seek_hole": false, 00:11:18.060 "seek_data": false, 00:11:18.060 "copy": false, 00:11:18.060 "nvme_iov_md": false 00:11:18.060 }, 00:11:18.060 "memory_domains": [ 00:11:18.060 { 00:11:18.060 "dma_device_id": "system", 00:11:18.060 "dma_device_type": 1 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.060 "dma_device_type": 2 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "dma_device_id": "system", 00:11:18.060 "dma_device_type": 1 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.060 "dma_device_type": 2 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "dma_device_id": "system", 00:11:18.060 "dma_device_type": 1 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.060 "dma_device_type": 2 00:11:18.060 } 00:11:18.060 ], 00:11:18.060 "driver_specific": { 00:11:18.060 "raid": { 00:11:18.060 "uuid": "4061dd7b-a8c8-4e15-ab74-495e7637ece8", 00:11:18.060 "strip_size_kb": 64, 00:11:18.060 "state": "online", 00:11:18.060 "raid_level": "raid0", 00:11:18.060 "superblock": false, 00:11:18.060 "num_base_bdevs": 3, 00:11:18.060 "num_base_bdevs_discovered": 3, 00:11:18.060 "num_base_bdevs_operational": 3, 00:11:18.060 "base_bdevs_list": [ 00:11:18.060 { 00:11:18.060 "name": "NewBaseBdev", 00:11:18.060 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:18.060 "is_configured": true, 00:11:18.060 "data_offset": 0, 00:11:18.060 "data_size": 65536 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "name": "BaseBdev2", 00:11:18.060 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:18.060 "is_configured": true, 00:11:18.060 "data_offset": 0, 
00:11:18.060 "data_size": 65536 00:11:18.060 }, 00:11:18.060 { 00:11:18.060 "name": "BaseBdev3", 00:11:18.060 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:18.060 "is_configured": true, 00:11:18.060 "data_offset": 0, 00:11:18.060 "data_size": 65536 00:11:18.060 } 00:11:18.060 ] 00:11:18.060 } 00:11:18.060 } 00:11:18.060 }' 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:18.060 BaseBdev2 00:11:18.060 BaseBdev3' 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:18.060 18:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:18.319 "name": "NewBaseBdev", 00:11:18.319 "aliases": [ 00:11:18.319 "986f41f1-67f6-4578-ab93-e591cb900868" 00:11:18.319 ], 00:11:18.319 "product_name": "Malloc disk", 00:11:18.319 "block_size": 512, 00:11:18.319 "num_blocks": 65536, 00:11:18.319 "uuid": "986f41f1-67f6-4578-ab93-e591cb900868", 00:11:18.319 "assigned_rate_limits": { 00:11:18.319 "rw_ios_per_sec": 0, 00:11:18.319 "rw_mbytes_per_sec": 0, 00:11:18.319 "r_mbytes_per_sec": 0, 00:11:18.319 "w_mbytes_per_sec": 0 00:11:18.319 }, 00:11:18.319 "claimed": true, 00:11:18.319 "claim_type": "exclusive_write", 00:11:18.319 "zoned": false, 00:11:18.319 "supported_io_types": { 00:11:18.319 "read": true, 00:11:18.319 "write": true, 00:11:18.319 "unmap": true, 00:11:18.319 "flush": true, 00:11:18.319 "reset": true, 00:11:18.319 "nvme_admin": false, 00:11:18.319 "nvme_io": false, 00:11:18.319 "nvme_io_md": false, 00:11:18.319 "write_zeroes": true, 00:11:18.319 "zcopy": true, 00:11:18.319 "get_zone_info": false, 00:11:18.319 "zone_management": false, 00:11:18.319 "zone_append": false, 00:11:18.319 "compare": false, 00:11:18.319 "compare_and_write": false, 00:11:18.319 "abort": true, 00:11:18.319 "seek_hole": false, 00:11:18.319 "seek_data": false, 00:11:18.319 "copy": true, 00:11:18.319 "nvme_iov_md": false 00:11:18.319 }, 00:11:18.319 "memory_domains": [ 00:11:18.319 { 00:11:18.319 "dma_device_id": "system", 00:11:18.319 "dma_device_type": 1 00:11:18.319 }, 00:11:18.319 { 00:11:18.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.319 "dma_device_type": 2 00:11:18.319 } 00:11:18.319 ], 00:11:18.319 "driver_specific": {} 00:11:18.319 }' 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:18.319 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.319 18:48:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:18.580 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:18.839 "name": "BaseBdev2", 00:11:18.839 "aliases": [ 00:11:18.839 "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e" 00:11:18.839 ], 00:11:18.839 "product_name": "Malloc disk", 00:11:18.839 "block_size": 512, 00:11:18.839 "num_blocks": 65536, 00:11:18.839 "uuid": "f930c584-b2ae-4bd3-af9b-b7eb52a1c60e", 00:11:18.839 "assigned_rate_limits": { 00:11:18.839 "rw_ios_per_sec": 0, 00:11:18.839 "rw_mbytes_per_sec": 0, 00:11:18.839 "r_mbytes_per_sec": 0, 00:11:18.839 "w_mbytes_per_sec": 0 00:11:18.839 }, 00:11:18.839 "claimed": true, 00:11:18.839 "claim_type": "exclusive_write", 00:11:18.839 "zoned": false, 00:11:18.839 "supported_io_types": { 00:11:18.839 "read": true, 00:11:18.839 "write": true, 00:11:18.839 "unmap": true, 00:11:18.839 "flush": true, 00:11:18.839 "reset": true, 00:11:18.839 "nvme_admin": false, 00:11:18.839 "nvme_io": false, 00:11:18.839 "nvme_io_md": false, 00:11:18.839 "write_zeroes": true, 00:11:18.839 "zcopy": true, 00:11:18.839 "get_zone_info": false, 00:11:18.839 "zone_management": false, 00:11:18.839 "zone_append": false, 00:11:18.839 "compare": false, 00:11:18.839 "compare_and_write": false, 00:11:18.839 "abort": true, 00:11:18.839 "seek_hole": false, 00:11:18.839 "seek_data": false, 00:11:18.839 "copy": true, 00:11:18.839 "nvme_iov_md": false 00:11:18.839 }, 00:11:18.839 "memory_domains": [ 00:11:18.839 { 00:11:18.839 "dma_device_id": "system", 00:11:18.839 "dma_device_type": 1 00:11:18.839 }, 00:11:18.839 { 00:11:18.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.839 "dma_device_type": 2 00:11:18.839 } 00:11:18.839 ], 00:11:18.839 "driver_specific": {} 00:11:18.839 }' 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:18.839 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:19.098 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:19.098 18:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:19.098 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:19.098 "name": "BaseBdev3", 00:11:19.098 "aliases": [ 00:11:19.098 "4a5ae503-d04c-4909-ba6e-efb7b32af445" 00:11:19.098 ], 00:11:19.098 "product_name": "Malloc disk", 00:11:19.098 "block_size": 512, 00:11:19.098 "num_blocks": 65536, 00:11:19.098 "uuid": "4a5ae503-d04c-4909-ba6e-efb7b32af445", 00:11:19.098 "assigned_rate_limits": { 00:11:19.098 "rw_ios_per_sec": 0, 00:11:19.098 "rw_mbytes_per_sec": 0, 00:11:19.098 "r_mbytes_per_sec": 0, 00:11:19.098 "w_mbytes_per_sec": 0 00:11:19.098 }, 00:11:19.098 "claimed": true, 00:11:19.098 "claim_type": "exclusive_write", 00:11:19.098 "zoned": false, 00:11:19.098 "supported_io_types": { 00:11:19.098 "read": true, 00:11:19.098 "write": true, 00:11:19.098 "unmap": true, 00:11:19.098 "flush": true, 00:11:19.098 "reset": true, 00:11:19.098 "nvme_admin": false, 00:11:19.098 "nvme_io": false, 00:11:19.098 "nvme_io_md": false, 00:11:19.098 "write_zeroes": true, 00:11:19.098 "zcopy": true, 00:11:19.098 "get_zone_info": false, 00:11:19.098 "zone_management": false, 00:11:19.098 "zone_append": false, 00:11:19.098 "compare": false, 00:11:19.098 "compare_and_write": false, 00:11:19.098 "abort": true, 00:11:19.098 "seek_hole": false, 00:11:19.098 "seek_data": false, 00:11:19.098 "copy": true, 00:11:19.098 "nvme_iov_md": false 00:11:19.098 }, 00:11:19.098 "memory_domains": [ 00:11:19.098 { 00:11:19.098 "dma_device_id": "system", 00:11:19.098 "dma_device_type": 1 00:11:19.098 }, 00:11:19.098 { 00:11:19.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.098 "dma_device_type": 2 00:11:19.098 } 00:11:19.098 ], 00:11:19.098 "driver_specific": {} 00:11:19.098 }' 00:11:19.098 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.098 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.098 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:19.098 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.357 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:19.358 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:19.617 [2024-07-24 18:48:04.433739] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:19.617 [2024-07-24 18:48:04.433756] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:19.617 [2024-07-24 18:48:04.433797] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:19.617 [2024-07-24 18:48:04.433836] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:19.617 [2024-07-24 18:48:04.433843] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x175ca90 name Existed_Raid, state offline 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2066857 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2066857 ']' 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2066857 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2066857 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2066857' 00:11:19.617 killing process with pid 2066857 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2066857 00:11:19.617 [2024-07-24 18:48:04.493106] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:19.617 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2066857 00:11:19.617 [2024-07-24 18:48:04.535799] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:19.876 18:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:19.876 00:11:19.876 real 0m21.424s 00:11:19.876 user 0m39.753s 00:11:19.876 sys 0m3.327s 00:11:19.876 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:19.876 18:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.876 ************************************ 00:11:19.876 END TEST raid_state_function_test 00:11:19.876 ************************************ 00:11:19.876 18:48:04 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:11:19.876 18:48:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:19.876 18:48:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:19.876 18:48:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:20.136 ************************************ 00:11:20.136 START 
TEST raid_state_function_test_sb 00:11:20.136 ************************************ 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2071226 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2071226' 00:11:20.136 Process raid pid: 2071226 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2071226 /var/tmp/spdk-raid.sock 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2071226 ']' 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:20.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:20.136 18:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:20.136 [2024-07-24 18:48:04.945356] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:11:20.136 [2024-07-24 18:48:04.945393] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:20.136 [2024-07-24 18:48:05.009880] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:20.136 [2024-07-24 18:48:05.085842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:20.136 [2024-07-24 18:48:05.136184] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:20.136 [2024-07-24 18:48:05.136204] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:21.071 [2024-07-24 18:48:05.887390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:21.071 [2024-07-24 18:48:05.887418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:21.071 [2024-07-24 18:48:05.887424] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:21.071 [2024-07-24 18:48:05.887429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:21.071 [2024-07-24 18:48:05.887435] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:21.071 [2024-07-24 18:48:05.887456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:21.071 18:48:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.071 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.072 18:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:21.329 18:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.330 "name": "Existed_Raid", 00:11:21.330 "uuid": "8d684dbd-afdf-4ce0-8949-84a8ef7fe6b9", 00:11:21.330 "strip_size_kb": 64, 00:11:21.330 "state": "configuring", 00:11:21.330 "raid_level": "raid0", 00:11:21.330 "superblock": true, 00:11:21.330 "num_base_bdevs": 3, 00:11:21.330 "num_base_bdevs_discovered": 0, 00:11:21.330 "num_base_bdevs_operational": 3, 00:11:21.330 "base_bdevs_list": [ 00:11:21.330 { 00:11:21.330 "name": "BaseBdev1", 00:11:21.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.330 "is_configured": false, 00:11:21.330 "data_offset": 0, 00:11:21.330 "data_size": 0 00:11:21.330 }, 00:11:21.330 { 00:11:21.330 "name": "BaseBdev2", 00:11:21.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.330 "is_configured": false, 00:11:21.330 "data_offset": 0, 00:11:21.330 "data_size": 0 00:11:21.330 }, 00:11:21.330 { 00:11:21.330 "name": "BaseBdev3", 00:11:21.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.330 "is_configured": false, 00:11:21.330 "data_offset": 0, 00:11:21.330 "data_size": 0 00:11:21.330 } 00:11:21.330 ] 00:11:21.330 }' 00:11:21.330 18:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.330 18:48:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:21.588 18:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:21.847 [2024-07-24 18:48:06.717456] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:21.847 [2024-07-24 18:48:06.717487] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb0ba0 name Existed_Raid, state configuring 00:11:21.847 18:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:22.107 [2024-07-24 18:48:06.889908] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:22.107 [2024-07-24 
18:48:06.889926] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:22.107 [2024-07-24 18:48:06.889931] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:22.107 [2024-07-24 18:48:06.889936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:22.107 [2024-07-24 18:48:06.889940] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:22.107 [2024-07-24 18:48:06.889944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:22.107 18:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:22.107 [2024-07-24 18:48:07.062593] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:22.107 BaseBdev1 00:11:22.107 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:22.107 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:22.108 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:22.108 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:22.108 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:22.108 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:22.108 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:22.419 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:22.684 [ 00:11:22.684 { 00:11:22.684 "name": "BaseBdev1", 00:11:22.684 "aliases": [ 00:11:22.684 "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf" 00:11:22.684 ], 00:11:22.684 "product_name": "Malloc disk", 00:11:22.684 "block_size": 512, 00:11:22.684 "num_blocks": 65536, 00:11:22.684 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:22.684 "assigned_rate_limits": { 00:11:22.684 "rw_ios_per_sec": 0, 00:11:22.684 "rw_mbytes_per_sec": 0, 00:11:22.684 "r_mbytes_per_sec": 0, 00:11:22.684 "w_mbytes_per_sec": 0 00:11:22.684 }, 00:11:22.684 "claimed": true, 00:11:22.684 "claim_type": "exclusive_write", 00:11:22.684 "zoned": false, 00:11:22.684 "supported_io_types": { 00:11:22.684 "read": true, 00:11:22.684 "write": true, 00:11:22.684 "unmap": true, 00:11:22.684 "flush": true, 00:11:22.684 "reset": true, 00:11:22.684 "nvme_admin": false, 00:11:22.684 "nvme_io": false, 00:11:22.684 "nvme_io_md": false, 00:11:22.684 "write_zeroes": true, 00:11:22.684 "zcopy": true, 00:11:22.684 "get_zone_info": false, 00:11:22.684 "zone_management": false, 00:11:22.684 "zone_append": false, 00:11:22.684 "compare": false, 00:11:22.684 "compare_and_write": false, 00:11:22.684 "abort": true, 00:11:22.684 "seek_hole": false, 00:11:22.684 "seek_data": false, 00:11:22.684 "copy": true, 00:11:22.684 "nvme_iov_md": false 00:11:22.684 }, 00:11:22.684 "memory_domains": [ 00:11:22.684 { 00:11:22.684 "dma_device_id": "system", 00:11:22.684 
"dma_device_type": 1 00:11:22.684 }, 00:11:22.684 { 00:11:22.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.684 "dma_device_type": 2 00:11:22.684 } 00:11:22.684 ], 00:11:22.684 "driver_specific": {} 00:11:22.684 } 00:11:22.684 ] 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.684 "name": "Existed_Raid", 00:11:22.684 "uuid": "0fab9ea7-2e94-4597-9f27-b293539b52a4", 00:11:22.684 "strip_size_kb": 64, 00:11:22.684 "state": "configuring", 00:11:22.684 "raid_level": "raid0", 00:11:22.684 "superblock": true, 00:11:22.684 "num_base_bdevs": 3, 00:11:22.684 "num_base_bdevs_discovered": 1, 00:11:22.684 "num_base_bdevs_operational": 3, 00:11:22.684 "base_bdevs_list": [ 00:11:22.684 { 00:11:22.684 "name": "BaseBdev1", 00:11:22.684 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:22.684 "is_configured": true, 00:11:22.684 "data_offset": 2048, 00:11:22.684 "data_size": 63488 00:11:22.684 }, 00:11:22.684 { 00:11:22.684 "name": "BaseBdev2", 00:11:22.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.684 "is_configured": false, 00:11:22.684 "data_offset": 0, 00:11:22.684 "data_size": 0 00:11:22.684 }, 00:11:22.684 { 00:11:22.684 "name": "BaseBdev3", 00:11:22.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.684 "is_configured": false, 00:11:22.684 "data_offset": 0, 00:11:22.684 "data_size": 0 00:11:22.684 } 00:11:22.684 ] 00:11:22.684 }' 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.684 18:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:23.251 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:23.251 [2024-07-24 18:48:08.233625] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:23.251 [2024-07-24 18:48:08.233657] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb0470 name Existed_Raid, state configuring 00:11:23.251 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:23.509 [2024-07-24 18:48:08.402100] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:23.509 [2024-07-24 18:48:08.403230] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:23.509 [2024-07-24 18:48:08.403258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:23.509 [2024-07-24 18:48:08.403264] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:23.509 [2024-07-24 18:48:08.403269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:23.509 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.510 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.510 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.510 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.510 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.510 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:23.768 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.768 "name": "Existed_Raid", 00:11:23.768 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:23.768 "strip_size_kb": 64, 00:11:23.768 "state": "configuring", 00:11:23.768 "raid_level": "raid0", 00:11:23.768 "superblock": true, 00:11:23.768 "num_base_bdevs": 3, 00:11:23.768 "num_base_bdevs_discovered": 1, 00:11:23.768 "num_base_bdevs_operational": 3, 00:11:23.768 "base_bdevs_list": [ 00:11:23.768 { 00:11:23.768 "name": "BaseBdev1", 00:11:23.768 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:23.768 "is_configured": true, 00:11:23.768 
"data_offset": 2048, 00:11:23.768 "data_size": 63488 00:11:23.768 }, 00:11:23.768 { 00:11:23.768 "name": "BaseBdev2", 00:11:23.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.768 "is_configured": false, 00:11:23.768 "data_offset": 0, 00:11:23.768 "data_size": 0 00:11:23.768 }, 00:11:23.768 { 00:11:23.768 "name": "BaseBdev3", 00:11:23.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.768 "is_configured": false, 00:11:23.768 "data_offset": 0, 00:11:23.768 "data_size": 0 00:11:23.768 } 00:11:23.768 ] 00:11:23.768 }' 00:11:23.768 18:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.768 18:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:24.334 [2024-07-24 18:48:09.266933] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:24.334 BaseBdev2 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.334 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:24.591 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:24.849 [ 00:11:24.849 { 00:11:24.849 "name": "BaseBdev2", 00:11:24.849 "aliases": [ 00:11:24.849 "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc" 00:11:24.849 ], 00:11:24.849 "product_name": "Malloc disk", 00:11:24.849 "block_size": 512, 00:11:24.849 "num_blocks": 65536, 00:11:24.849 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:24.849 "assigned_rate_limits": { 00:11:24.849 "rw_ios_per_sec": 0, 00:11:24.849 "rw_mbytes_per_sec": 0, 00:11:24.849 "r_mbytes_per_sec": 0, 00:11:24.849 "w_mbytes_per_sec": 0 00:11:24.849 }, 00:11:24.849 "claimed": true, 00:11:24.849 "claim_type": "exclusive_write", 00:11:24.849 "zoned": false, 00:11:24.849 "supported_io_types": { 00:11:24.849 "read": true, 00:11:24.849 "write": true, 00:11:24.849 "unmap": true, 00:11:24.849 "flush": true, 00:11:24.849 "reset": true, 00:11:24.849 "nvme_admin": false, 00:11:24.849 "nvme_io": false, 00:11:24.849 "nvme_io_md": false, 00:11:24.849 "write_zeroes": true, 00:11:24.849 "zcopy": true, 00:11:24.849 "get_zone_info": false, 00:11:24.849 "zone_management": false, 00:11:24.849 "zone_append": false, 00:11:24.849 "compare": false, 00:11:24.849 "compare_and_write": false, 00:11:24.849 "abort": true, 00:11:24.849 "seek_hole": false, 00:11:24.849 "seek_data": false, 00:11:24.849 "copy": true, 00:11:24.849 "nvme_iov_md": false 
00:11:24.849 }, 00:11:24.849 "memory_domains": [ 00:11:24.849 { 00:11:24.849 "dma_device_id": "system", 00:11:24.849 "dma_device_type": 1 00:11:24.849 }, 00:11:24.849 { 00:11:24.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.849 "dma_device_type": 2 00:11:24.849 } 00:11:24.849 ], 00:11:24.849 "driver_specific": {} 00:11:24.849 } 00:11:24.849 ] 00:11:24.849 18:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:24.849 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:24.849 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:24.849 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.850 "name": "Existed_Raid", 00:11:24.850 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:24.850 "strip_size_kb": 64, 00:11:24.850 "state": "configuring", 00:11:24.850 "raid_level": "raid0", 00:11:24.850 "superblock": true, 00:11:24.850 "num_base_bdevs": 3, 00:11:24.850 "num_base_bdevs_discovered": 2, 00:11:24.850 "num_base_bdevs_operational": 3, 00:11:24.850 "base_bdevs_list": [ 00:11:24.850 { 00:11:24.850 "name": "BaseBdev1", 00:11:24.850 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:24.850 "is_configured": true, 00:11:24.850 "data_offset": 2048, 00:11:24.850 "data_size": 63488 00:11:24.850 }, 00:11:24.850 { 00:11:24.850 "name": "BaseBdev2", 00:11:24.850 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:24.850 "is_configured": true, 00:11:24.850 "data_offset": 2048, 00:11:24.850 "data_size": 63488 00:11:24.850 }, 00:11:24.850 { 00:11:24.850 "name": "BaseBdev3", 00:11:24.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.850 "is_configured": false, 00:11:24.850 "data_offset": 0, 00:11:24.850 "data_size": 0 00:11:24.850 } 00:11:24.850 ] 00:11:24.850 }' 00:11:24.850 18:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.850 18:48:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:25.413 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:25.671 [2024-07-24 18:48:10.456680] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:25.671 [2024-07-24 18:48:10.456808] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xcb1360 00:11:25.671 [2024-07-24 18:48:10.456818] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:25.671 [2024-07-24 18:48:10.456941] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b67d0 00:11:25.671 [2024-07-24 18:48:10.457023] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcb1360 00:11:25.671 [2024-07-24 18:48:10.457028] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcb1360 00:11:25.671 [2024-07-24 18:48:10.457090] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:25.671 BaseBdev3 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:25.671 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:25.929 [ 00:11:25.929 { 00:11:25.929 "name": "BaseBdev3", 00:11:25.929 "aliases": [ 00:11:25.929 "67f72fd7-86ea-4ffc-83fe-7954d9dce768" 00:11:25.929 ], 00:11:25.929 "product_name": "Malloc disk", 00:11:25.929 "block_size": 512, 00:11:25.929 "num_blocks": 65536, 00:11:25.929 "uuid": "67f72fd7-86ea-4ffc-83fe-7954d9dce768", 00:11:25.929 "assigned_rate_limits": { 00:11:25.929 "rw_ios_per_sec": 0, 00:11:25.929 "rw_mbytes_per_sec": 0, 00:11:25.929 "r_mbytes_per_sec": 0, 00:11:25.929 "w_mbytes_per_sec": 0 00:11:25.929 }, 00:11:25.929 "claimed": true, 00:11:25.929 "claim_type": "exclusive_write", 00:11:25.929 "zoned": false, 00:11:25.929 "supported_io_types": { 00:11:25.929 "read": true, 00:11:25.929 "write": true, 00:11:25.929 "unmap": true, 00:11:25.929 "flush": true, 00:11:25.929 "reset": true, 00:11:25.929 "nvme_admin": false, 00:11:25.929 "nvme_io": false, 00:11:25.929 "nvme_io_md": false, 00:11:25.929 "write_zeroes": true, 00:11:25.929 "zcopy": true, 00:11:25.929 "get_zone_info": false, 00:11:25.929 "zone_management": false, 00:11:25.929 "zone_append": false, 00:11:25.929 "compare": false, 00:11:25.929 "compare_and_write": false, 00:11:25.929 "abort": true, 00:11:25.929 "seek_hole": false, 00:11:25.929 
"seek_data": false, 00:11:25.929 "copy": true, 00:11:25.929 "nvme_iov_md": false 00:11:25.929 }, 00:11:25.929 "memory_domains": [ 00:11:25.929 { 00:11:25.929 "dma_device_id": "system", 00:11:25.929 "dma_device_type": 1 00:11:25.929 }, 00:11:25.929 { 00:11:25.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:25.929 "dma_device_type": 2 00:11:25.929 } 00:11:25.929 ], 00:11:25.929 "driver_specific": {} 00:11:25.929 } 00:11:25.929 ] 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.929 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.930 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.930 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:26.188 18:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.188 "name": "Existed_Raid", 00:11:26.188 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:26.188 "strip_size_kb": 64, 00:11:26.188 "state": "online", 00:11:26.188 "raid_level": "raid0", 00:11:26.188 "superblock": true, 00:11:26.188 "num_base_bdevs": 3, 00:11:26.188 "num_base_bdevs_discovered": 3, 00:11:26.188 "num_base_bdevs_operational": 3, 00:11:26.188 "base_bdevs_list": [ 00:11:26.188 { 00:11:26.188 "name": "BaseBdev1", 00:11:26.188 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:26.188 "is_configured": true, 00:11:26.188 "data_offset": 2048, 00:11:26.188 "data_size": 63488 00:11:26.188 }, 00:11:26.188 { 00:11:26.188 "name": "BaseBdev2", 00:11:26.188 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:26.188 "is_configured": true, 00:11:26.188 "data_offset": 2048, 00:11:26.188 "data_size": 63488 00:11:26.188 }, 00:11:26.188 { 00:11:26.188 "name": "BaseBdev3", 00:11:26.188 "uuid": "67f72fd7-86ea-4ffc-83fe-7954d9dce768", 00:11:26.188 "is_configured": true, 00:11:26.188 "data_offset": 2048, 00:11:26.188 "data_size": 63488 00:11:26.188 } 00:11:26.188 ] 00:11:26.188 }' 00:11:26.188 18:48:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.188 18:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:26.755 [2024-07-24 18:48:11.623902] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:26.755 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:26.755 "name": "Existed_Raid", 00:11:26.755 "aliases": [ 00:11:26.755 "21feaf1a-5292-4b94-a54c-5b7fcd4b2058" 00:11:26.755 ], 00:11:26.755 "product_name": "Raid Volume", 00:11:26.755 "block_size": 512, 00:11:26.755 "num_blocks": 190464, 00:11:26.755 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:26.755 "assigned_rate_limits": { 00:11:26.755 "rw_ios_per_sec": 0, 00:11:26.755 "rw_mbytes_per_sec": 0, 00:11:26.755 "r_mbytes_per_sec": 0, 00:11:26.755 "w_mbytes_per_sec": 0 00:11:26.755 }, 00:11:26.755 "claimed": false, 00:11:26.755 "zoned": false, 00:11:26.755 "supported_io_types": { 00:11:26.755 "read": true, 00:11:26.755 "write": true, 00:11:26.755 "unmap": true, 00:11:26.755 "flush": true, 00:11:26.755 "reset": true, 00:11:26.755 "nvme_admin": false, 00:11:26.755 "nvme_io": false, 00:11:26.755 "nvme_io_md": false, 00:11:26.755 "write_zeroes": true, 00:11:26.755 "zcopy": false, 00:11:26.755 "get_zone_info": false, 00:11:26.755 "zone_management": false, 00:11:26.755 "zone_append": false, 00:11:26.755 "compare": false, 00:11:26.755 "compare_and_write": false, 00:11:26.755 "abort": false, 00:11:26.755 "seek_hole": false, 00:11:26.755 "seek_data": false, 00:11:26.755 "copy": false, 00:11:26.755 "nvme_iov_md": false 00:11:26.755 }, 00:11:26.755 "memory_domains": [ 00:11:26.755 { 00:11:26.755 "dma_device_id": "system", 00:11:26.755 "dma_device_type": 1 00:11:26.755 }, 00:11:26.755 { 00:11:26.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.755 "dma_device_type": 2 00:11:26.755 }, 00:11:26.755 { 00:11:26.755 "dma_device_id": "system", 00:11:26.755 "dma_device_type": 1 00:11:26.755 }, 00:11:26.755 { 00:11:26.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.755 "dma_device_type": 2 00:11:26.755 }, 00:11:26.755 { 00:11:26.756 "dma_device_id": "system", 00:11:26.756 "dma_device_type": 1 00:11:26.756 }, 00:11:26.756 { 00:11:26.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.756 "dma_device_type": 2 00:11:26.756 } 00:11:26.756 ], 00:11:26.756 "driver_specific": { 00:11:26.756 "raid": { 00:11:26.756 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:26.756 "strip_size_kb": 64, 00:11:26.756 "state": "online", 00:11:26.756 
"raid_level": "raid0", 00:11:26.756 "superblock": true, 00:11:26.756 "num_base_bdevs": 3, 00:11:26.756 "num_base_bdevs_discovered": 3, 00:11:26.756 "num_base_bdevs_operational": 3, 00:11:26.756 "base_bdevs_list": [ 00:11:26.756 { 00:11:26.756 "name": "BaseBdev1", 00:11:26.756 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:26.756 "is_configured": true, 00:11:26.756 "data_offset": 2048, 00:11:26.756 "data_size": 63488 00:11:26.756 }, 00:11:26.756 { 00:11:26.756 "name": "BaseBdev2", 00:11:26.756 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:26.756 "is_configured": true, 00:11:26.756 "data_offset": 2048, 00:11:26.756 "data_size": 63488 00:11:26.756 }, 00:11:26.756 { 00:11:26.756 "name": "BaseBdev3", 00:11:26.756 "uuid": "67f72fd7-86ea-4ffc-83fe-7954d9dce768", 00:11:26.756 "is_configured": true, 00:11:26.756 "data_offset": 2048, 00:11:26.756 "data_size": 63488 00:11:26.756 } 00:11:26.756 ] 00:11:26.756 } 00:11:26.756 } 00:11:26.756 }' 00:11:26.756 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:26.756 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:26.756 BaseBdev2 00:11:26.756 BaseBdev3' 00:11:26.756 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:26.756 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:26.756 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.015 "name": "BaseBdev1", 00:11:27.015 "aliases": [ 00:11:27.015 "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf" 00:11:27.015 ], 00:11:27.015 "product_name": "Malloc disk", 00:11:27.015 "block_size": 512, 00:11:27.015 "num_blocks": 65536, 00:11:27.015 "uuid": "502cd30b-575d-42b4-85a6-6dcfa2c8d1bf", 00:11:27.015 "assigned_rate_limits": { 00:11:27.015 "rw_ios_per_sec": 0, 00:11:27.015 "rw_mbytes_per_sec": 0, 00:11:27.015 "r_mbytes_per_sec": 0, 00:11:27.015 "w_mbytes_per_sec": 0 00:11:27.015 }, 00:11:27.015 "claimed": true, 00:11:27.015 "claim_type": "exclusive_write", 00:11:27.015 "zoned": false, 00:11:27.015 "supported_io_types": { 00:11:27.015 "read": true, 00:11:27.015 "write": true, 00:11:27.015 "unmap": true, 00:11:27.015 "flush": true, 00:11:27.015 "reset": true, 00:11:27.015 "nvme_admin": false, 00:11:27.015 "nvme_io": false, 00:11:27.015 "nvme_io_md": false, 00:11:27.015 "write_zeroes": true, 00:11:27.015 "zcopy": true, 00:11:27.015 "get_zone_info": false, 00:11:27.015 "zone_management": false, 00:11:27.015 "zone_append": false, 00:11:27.015 "compare": false, 00:11:27.015 "compare_and_write": false, 00:11:27.015 "abort": true, 00:11:27.015 "seek_hole": false, 00:11:27.015 "seek_data": false, 00:11:27.015 "copy": true, 00:11:27.015 "nvme_iov_md": false 00:11:27.015 }, 00:11:27.015 "memory_domains": [ 00:11:27.015 { 00:11:27.015 "dma_device_id": "system", 00:11:27.015 "dma_device_type": 1 00:11:27.015 }, 00:11:27.015 { 00:11:27.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.015 "dma_device_type": 2 00:11:27.015 } 00:11:27.015 ], 00:11:27.015 "driver_specific": {} 00:11:27.015 }' 00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.015 18:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.273 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.273 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.273 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.273 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:27.274 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.532 "name": "BaseBdev2", 00:11:27.532 "aliases": [ 00:11:27.532 "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc" 00:11:27.532 ], 00:11:27.532 "product_name": "Malloc disk", 00:11:27.532 "block_size": 512, 00:11:27.532 "num_blocks": 65536, 00:11:27.532 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:27.532 "assigned_rate_limits": { 00:11:27.532 "rw_ios_per_sec": 0, 00:11:27.532 "rw_mbytes_per_sec": 0, 00:11:27.532 "r_mbytes_per_sec": 0, 00:11:27.532 "w_mbytes_per_sec": 0 00:11:27.532 }, 00:11:27.532 "claimed": true, 00:11:27.532 "claim_type": "exclusive_write", 00:11:27.532 "zoned": false, 00:11:27.532 "supported_io_types": { 00:11:27.532 "read": true, 00:11:27.532 "write": true, 00:11:27.532 "unmap": true, 00:11:27.532 "flush": true, 00:11:27.532 "reset": true, 00:11:27.532 "nvme_admin": false, 00:11:27.532 "nvme_io": false, 00:11:27.532 "nvme_io_md": false, 00:11:27.532 "write_zeroes": true, 00:11:27.532 "zcopy": true, 00:11:27.532 "get_zone_info": false, 00:11:27.532 "zone_management": false, 00:11:27.532 "zone_append": false, 00:11:27.532 "compare": false, 00:11:27.532 "compare_and_write": false, 00:11:27.532 "abort": true, 00:11:27.532 "seek_hole": false, 00:11:27.532 "seek_data": false, 00:11:27.532 "copy": true, 00:11:27.532 "nvme_iov_md": false 00:11:27.532 }, 00:11:27.532 "memory_domains": [ 00:11:27.532 { 00:11:27.532 "dma_device_id": "system", 00:11:27.532 "dma_device_type": 1 00:11:27.532 }, 00:11:27.532 { 00:11:27.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.532 "dma_device_type": 2 00:11:27.532 } 00:11:27.532 ], 00:11:27.532 "driver_specific": {} 00:11:27.532 }' 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.532 18:48:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.532 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:27.790 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.048 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.048 "name": "BaseBdev3", 00:11:28.048 "aliases": [ 00:11:28.048 "67f72fd7-86ea-4ffc-83fe-7954d9dce768" 00:11:28.048 ], 00:11:28.048 "product_name": "Malloc disk", 00:11:28.048 "block_size": 512, 00:11:28.048 "num_blocks": 65536, 00:11:28.049 "uuid": "67f72fd7-86ea-4ffc-83fe-7954d9dce768", 00:11:28.049 "assigned_rate_limits": { 00:11:28.049 "rw_ios_per_sec": 0, 00:11:28.049 "rw_mbytes_per_sec": 0, 00:11:28.049 "r_mbytes_per_sec": 0, 00:11:28.049 "w_mbytes_per_sec": 0 00:11:28.049 }, 00:11:28.049 "claimed": true, 00:11:28.049 "claim_type": "exclusive_write", 00:11:28.049 "zoned": false, 00:11:28.049 "supported_io_types": { 00:11:28.049 "read": true, 00:11:28.049 "write": true, 00:11:28.049 "unmap": true, 00:11:28.049 "flush": true, 00:11:28.049 "reset": true, 00:11:28.049 "nvme_admin": false, 00:11:28.049 "nvme_io": false, 00:11:28.049 "nvme_io_md": false, 00:11:28.049 "write_zeroes": true, 00:11:28.049 "zcopy": true, 00:11:28.049 "get_zone_info": false, 00:11:28.049 "zone_management": false, 00:11:28.049 "zone_append": false, 00:11:28.049 "compare": false, 00:11:28.049 "compare_and_write": false, 00:11:28.049 "abort": true, 00:11:28.049 "seek_hole": false, 00:11:28.049 "seek_data": false, 00:11:28.049 "copy": true, 00:11:28.049 "nvme_iov_md": false 00:11:28.049 }, 00:11:28.049 "memory_domains": [ 00:11:28.049 { 00:11:28.049 "dma_device_id": "system", 00:11:28.049 "dma_device_type": 1 00:11:28.049 }, 00:11:28.049 { 00:11:28.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.049 "dma_device_type": 2 00:11:28.049 } 00:11:28.049 ], 00:11:28.049 "driver_specific": {} 00:11:28.049 }' 00:11:28.049 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.049 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.049 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.049 18:48:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.049 18:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.049 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.049 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.049 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.307 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.307 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.307 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.307 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.307 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:28.565 [2024-07-24 18:48:13.320307] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:28.565 [2024-07-24 18:48:13.320329] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:28.565 [2024-07-24 18:48:13.320358] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.565 "name": "Existed_Raid", 00:11:28.565 "uuid": "21feaf1a-5292-4b94-a54c-5b7fcd4b2058", 00:11:28.565 "strip_size_kb": 64, 00:11:28.565 "state": "offline", 00:11:28.565 "raid_level": "raid0", 00:11:28.565 "superblock": true, 00:11:28.565 "num_base_bdevs": 3, 00:11:28.565 "num_base_bdevs_discovered": 2, 00:11:28.565 "num_base_bdevs_operational": 2, 00:11:28.565 "base_bdevs_list": [ 00:11:28.565 { 00:11:28.565 "name": null, 00:11:28.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.565 "is_configured": false, 00:11:28.565 "data_offset": 2048, 00:11:28.565 "data_size": 63488 00:11:28.565 }, 00:11:28.565 { 00:11:28.565 "name": "BaseBdev2", 00:11:28.565 "uuid": "b841d28f-55e8-49d4-b6e4-e0707aa3f3fc", 00:11:28.565 "is_configured": true, 00:11:28.565 "data_offset": 2048, 00:11:28.565 "data_size": 63488 00:11:28.565 }, 00:11:28.565 { 00:11:28.565 "name": "BaseBdev3", 00:11:28.565 "uuid": "67f72fd7-86ea-4ffc-83fe-7954d9dce768", 00:11:28.565 "is_configured": true, 00:11:28.565 "data_offset": 2048, 00:11:28.565 "data_size": 63488 00:11:28.565 } 00:11:28.565 ] 00:11:28.565 }' 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.565 18:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:29.130 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:29.130 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:29.130 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.130 18:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:29.388 [2024-07-24 18:48:14.315771] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.388 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:29.647 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:29.647 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:29.647 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:29.905 [2024-07-24 18:48:14.662440] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:29.905 [2024-07-24 18:48:14.662479] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb1360 name Existed_Raid, state offline 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:29.905 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:29.906 18:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:30.164 BaseBdev2 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:30.164 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:30.423 [ 00:11:30.423 { 00:11:30.423 "name": "BaseBdev2", 00:11:30.423 "aliases": [ 00:11:30.423 "c0740e12-6338-4fae-ad0c-24a8c62a7a4d" 00:11:30.423 ], 00:11:30.423 "product_name": "Malloc disk", 00:11:30.423 "block_size": 512, 00:11:30.423 "num_blocks": 65536, 00:11:30.423 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:30.423 "assigned_rate_limits": { 00:11:30.423 "rw_ios_per_sec": 0, 00:11:30.423 "rw_mbytes_per_sec": 0, 00:11:30.423 "r_mbytes_per_sec": 0, 00:11:30.423 "w_mbytes_per_sec": 0 00:11:30.423 }, 00:11:30.423 "claimed": false, 00:11:30.423 "zoned": false, 00:11:30.423 "supported_io_types": { 00:11:30.423 "read": true, 00:11:30.423 "write": true, 00:11:30.423 "unmap": true, 00:11:30.423 "flush": true, 00:11:30.423 "reset": true, 00:11:30.423 "nvme_admin": false, 00:11:30.423 "nvme_io": false, 00:11:30.423 "nvme_io_md": false, 00:11:30.423 "write_zeroes": true, 00:11:30.423 
"zcopy": true, 00:11:30.423 "get_zone_info": false, 00:11:30.423 "zone_management": false, 00:11:30.423 "zone_append": false, 00:11:30.423 "compare": false, 00:11:30.423 "compare_and_write": false, 00:11:30.423 "abort": true, 00:11:30.423 "seek_hole": false, 00:11:30.423 "seek_data": false, 00:11:30.423 "copy": true, 00:11:30.423 "nvme_iov_md": false 00:11:30.423 }, 00:11:30.423 "memory_domains": [ 00:11:30.423 { 00:11:30.423 "dma_device_id": "system", 00:11:30.423 "dma_device_type": 1 00:11:30.423 }, 00:11:30.423 { 00:11:30.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.423 "dma_device_type": 2 00:11:30.423 } 00:11:30.423 ], 00:11:30.423 "driver_specific": {} 00:11:30.423 } 00:11:30.423 ] 00:11:30.423 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:30.423 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:30.423 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:30.423 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:30.682 BaseBdev3 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:30.682 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:30.941 [ 00:11:30.941 { 00:11:30.941 "name": "BaseBdev3", 00:11:30.941 "aliases": [ 00:11:30.941 "254d929e-3fa8-4ab7-b392-86a17736c666" 00:11:30.941 ], 00:11:30.941 "product_name": "Malloc disk", 00:11:30.941 "block_size": 512, 00:11:30.941 "num_blocks": 65536, 00:11:30.941 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:30.941 "assigned_rate_limits": { 00:11:30.941 "rw_ios_per_sec": 0, 00:11:30.941 "rw_mbytes_per_sec": 0, 00:11:30.941 "r_mbytes_per_sec": 0, 00:11:30.941 "w_mbytes_per_sec": 0 00:11:30.941 }, 00:11:30.941 "claimed": false, 00:11:30.941 "zoned": false, 00:11:30.941 "supported_io_types": { 00:11:30.941 "read": true, 00:11:30.941 "write": true, 00:11:30.941 "unmap": true, 00:11:30.941 "flush": true, 00:11:30.941 "reset": true, 00:11:30.941 "nvme_admin": false, 00:11:30.941 "nvme_io": false, 00:11:30.941 "nvme_io_md": false, 00:11:30.941 "write_zeroes": true, 00:11:30.941 "zcopy": true, 00:11:30.941 "get_zone_info": false, 00:11:30.941 "zone_management": false, 00:11:30.941 "zone_append": false, 00:11:30.941 "compare": false, 00:11:30.941 "compare_and_write": false, 00:11:30.941 "abort": true, 00:11:30.941 "seek_hole": 
false, 00:11:30.941 "seek_data": false, 00:11:30.941 "copy": true, 00:11:30.941 "nvme_iov_md": false 00:11:30.941 }, 00:11:30.941 "memory_domains": [ 00:11:30.941 { 00:11:30.941 "dma_device_id": "system", 00:11:30.941 "dma_device_type": 1 00:11:30.941 }, 00:11:30.941 { 00:11:30.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.941 "dma_device_type": 2 00:11:30.941 } 00:11:30.941 ], 00:11:30.941 "driver_specific": {} 00:11:30.941 } 00:11:30.941 ] 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:30.941 [2024-07-24 18:48:15.930934] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:30.941 [2024-07-24 18:48:15.930966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:30.941 [2024-07-24 18:48:15.930977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:30.941 [2024-07-24 18:48:15.931857] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.941 18:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:31.200 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.200 "name": "Existed_Raid", 00:11:31.200 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:31.200 "strip_size_kb": 64, 00:11:31.200 "state": "configuring", 00:11:31.200 "raid_level": "raid0", 00:11:31.200 "superblock": true, 00:11:31.200 "num_base_bdevs": 3, 00:11:31.200 "num_base_bdevs_discovered": 2, 00:11:31.200 "num_base_bdevs_operational": 3, 
00:11:31.200 "base_bdevs_list": [ 00:11:31.200 { 00:11:31.200 "name": "BaseBdev1", 00:11:31.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.200 "is_configured": false, 00:11:31.200 "data_offset": 0, 00:11:31.200 "data_size": 0 00:11:31.200 }, 00:11:31.200 { 00:11:31.200 "name": "BaseBdev2", 00:11:31.200 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:31.200 "is_configured": true, 00:11:31.200 "data_offset": 2048, 00:11:31.200 "data_size": 63488 00:11:31.200 }, 00:11:31.200 { 00:11:31.200 "name": "BaseBdev3", 00:11:31.200 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:31.200 "is_configured": true, 00:11:31.200 "data_offset": 2048, 00:11:31.200 "data_size": 63488 00:11:31.200 } 00:11:31.200 ] 00:11:31.200 }' 00:11:31.200 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.200 18:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:31.768 [2024-07-24 18:48:16.733009] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.768 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.027 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.028 "name": "Existed_Raid", 00:11:32.028 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:32.028 "strip_size_kb": 64, 00:11:32.028 "state": "configuring", 00:11:32.028 "raid_level": "raid0", 00:11:32.028 "superblock": true, 00:11:32.028 "num_base_bdevs": 3, 00:11:32.028 "num_base_bdevs_discovered": 1, 00:11:32.028 "num_base_bdevs_operational": 3, 00:11:32.028 "base_bdevs_list": [ 00:11:32.028 { 00:11:32.028 "name": "BaseBdev1", 00:11:32.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.028 "is_configured": false, 00:11:32.028 "data_offset": 0, 00:11:32.028 "data_size": 0 00:11:32.028 }, 00:11:32.028 { 
00:11:32.028 "name": null, 00:11:32.028 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:32.028 "is_configured": false, 00:11:32.028 "data_offset": 2048, 00:11:32.028 "data_size": 63488 00:11:32.028 }, 00:11:32.028 { 00:11:32.028 "name": "BaseBdev3", 00:11:32.028 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:32.028 "is_configured": true, 00:11:32.028 "data_offset": 2048, 00:11:32.028 "data_size": 63488 00:11:32.028 } 00:11:32.028 ] 00:11:32.028 }' 00:11:32.028 18:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.028 18:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:32.595 18:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.595 18:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:32.595 18:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:32.595 18:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:32.855 [2024-07-24 18:48:17.746357] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:32.855 BaseBdev1 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:32.855 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:33.114 18:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:33.114 [ 00:11:33.114 { 00:11:33.114 "name": "BaseBdev1", 00:11:33.114 "aliases": [ 00:11:33.114 "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4" 00:11:33.114 ], 00:11:33.114 "product_name": "Malloc disk", 00:11:33.114 "block_size": 512, 00:11:33.114 "num_blocks": 65536, 00:11:33.114 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:33.114 "assigned_rate_limits": { 00:11:33.114 "rw_ios_per_sec": 0, 00:11:33.114 "rw_mbytes_per_sec": 0, 00:11:33.114 "r_mbytes_per_sec": 0, 00:11:33.114 "w_mbytes_per_sec": 0 00:11:33.114 }, 00:11:33.114 "claimed": true, 00:11:33.114 "claim_type": "exclusive_write", 00:11:33.114 "zoned": false, 00:11:33.114 "supported_io_types": { 00:11:33.114 "read": true, 00:11:33.114 "write": true, 00:11:33.114 "unmap": true, 00:11:33.114 "flush": true, 00:11:33.114 "reset": true, 00:11:33.114 "nvme_admin": false, 00:11:33.114 "nvme_io": false, 00:11:33.114 "nvme_io_md": false, 00:11:33.114 "write_zeroes": true, 
00:11:33.114 "zcopy": true, 00:11:33.114 "get_zone_info": false, 00:11:33.114 "zone_management": false, 00:11:33.114 "zone_append": false, 00:11:33.114 "compare": false, 00:11:33.114 "compare_and_write": false, 00:11:33.114 "abort": true, 00:11:33.114 "seek_hole": false, 00:11:33.114 "seek_data": false, 00:11:33.114 "copy": true, 00:11:33.114 "nvme_iov_md": false 00:11:33.114 }, 00:11:33.114 "memory_domains": [ 00:11:33.114 { 00:11:33.114 "dma_device_id": "system", 00:11:33.114 "dma_device_type": 1 00:11:33.114 }, 00:11:33.114 { 00:11:33.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.114 "dma_device_type": 2 00:11:33.114 } 00:11:33.114 ], 00:11:33.114 "driver_specific": {} 00:11:33.114 } 00:11:33.114 ] 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.114 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.373 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.373 "name": "Existed_Raid", 00:11:33.373 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:33.373 "strip_size_kb": 64, 00:11:33.373 "state": "configuring", 00:11:33.373 "raid_level": "raid0", 00:11:33.373 "superblock": true, 00:11:33.373 "num_base_bdevs": 3, 00:11:33.373 "num_base_bdevs_discovered": 2, 00:11:33.373 "num_base_bdevs_operational": 3, 00:11:33.373 "base_bdevs_list": [ 00:11:33.373 { 00:11:33.373 "name": "BaseBdev1", 00:11:33.373 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:33.373 "is_configured": true, 00:11:33.373 "data_offset": 2048, 00:11:33.373 "data_size": 63488 00:11:33.373 }, 00:11:33.373 { 00:11:33.373 "name": null, 00:11:33.373 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:33.373 "is_configured": false, 00:11:33.373 "data_offset": 2048, 00:11:33.373 "data_size": 63488 00:11:33.373 }, 00:11:33.373 { 00:11:33.373 "name": "BaseBdev3", 00:11:33.373 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:33.373 "is_configured": true, 00:11:33.373 "data_offset": 2048, 00:11:33.373 "data_size": 63488 00:11:33.373 } 00:11:33.373 ] 
00:11:33.373 }' 00:11:33.373 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.373 18:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:33.940 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.940 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:33.940 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:33.940 18:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:34.198 [2024-07-24 18:48:19.069809] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.198 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.457 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.457 "name": "Existed_Raid", 00:11:34.457 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:34.457 "strip_size_kb": 64, 00:11:34.457 "state": "configuring", 00:11:34.457 "raid_level": "raid0", 00:11:34.457 "superblock": true, 00:11:34.457 "num_base_bdevs": 3, 00:11:34.457 "num_base_bdevs_discovered": 1, 00:11:34.457 "num_base_bdevs_operational": 3, 00:11:34.457 "base_bdevs_list": [ 00:11:34.457 { 00:11:34.457 "name": "BaseBdev1", 00:11:34.457 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:34.457 "is_configured": true, 00:11:34.457 "data_offset": 2048, 00:11:34.457 "data_size": 63488 00:11:34.457 }, 00:11:34.457 { 00:11:34.457 "name": null, 00:11:34.457 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:34.457 "is_configured": false, 00:11:34.457 "data_offset": 2048, 00:11:34.457 "data_size": 63488 00:11:34.457 }, 00:11:34.457 { 00:11:34.457 "name": null, 00:11:34.457 "uuid": 
"254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:34.457 "is_configured": false, 00:11:34.457 "data_offset": 2048, 00:11:34.457 "data_size": 63488 00:11:34.457 } 00:11:34.457 ] 00:11:34.457 }' 00:11:34.457 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.457 18:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:35.024 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.024 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:35.024 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:35.024 18:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:35.284 [2024-07-24 18:48:20.064397] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:35.284 "name": "Existed_Raid", 00:11:35.284 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:35.284 "strip_size_kb": 64, 00:11:35.284 "state": "configuring", 00:11:35.284 "raid_level": "raid0", 00:11:35.284 "superblock": true, 00:11:35.284 "num_base_bdevs": 3, 00:11:35.284 "num_base_bdevs_discovered": 2, 00:11:35.284 "num_base_bdevs_operational": 3, 00:11:35.284 "base_bdevs_list": [ 00:11:35.284 { 00:11:35.284 "name": "BaseBdev1", 00:11:35.284 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:35.284 "is_configured": true, 00:11:35.284 "data_offset": 2048, 00:11:35.284 "data_size": 63488 00:11:35.284 }, 00:11:35.284 { 00:11:35.284 "name": null, 00:11:35.284 "uuid": 
"c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:35.284 "is_configured": false, 00:11:35.284 "data_offset": 2048, 00:11:35.284 "data_size": 63488 00:11:35.284 }, 00:11:35.284 { 00:11:35.284 "name": "BaseBdev3", 00:11:35.284 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:35.284 "is_configured": true, 00:11:35.284 "data_offset": 2048, 00:11:35.284 "data_size": 63488 00:11:35.284 } 00:11:35.284 ] 00:11:35.284 }' 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:35.284 18:48:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:35.851 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:35.851 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.109 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:36.109 18:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:36.109 [2024-07-24 18:48:21.042947] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.110 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.368 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.368 "name": "Existed_Raid", 00:11:36.368 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:36.368 "strip_size_kb": 64, 00:11:36.368 "state": "configuring", 00:11:36.368 "raid_level": "raid0", 00:11:36.368 "superblock": true, 00:11:36.368 "num_base_bdevs": 3, 00:11:36.368 "num_base_bdevs_discovered": 1, 00:11:36.368 "num_base_bdevs_operational": 3, 00:11:36.368 "base_bdevs_list": [ 00:11:36.368 { 00:11:36.368 "name": null, 00:11:36.368 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:36.368 
"is_configured": false, 00:11:36.368 "data_offset": 2048, 00:11:36.368 "data_size": 63488 00:11:36.368 }, 00:11:36.368 { 00:11:36.368 "name": null, 00:11:36.368 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:36.368 "is_configured": false, 00:11:36.368 "data_offset": 2048, 00:11:36.368 "data_size": 63488 00:11:36.368 }, 00:11:36.368 { 00:11:36.368 "name": "BaseBdev3", 00:11:36.368 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:36.368 "is_configured": true, 00:11:36.368 "data_offset": 2048, 00:11:36.368 "data_size": 63488 00:11:36.368 } 00:11:36.368 ] 00:11:36.368 }' 00:11:36.368 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.368 18:48:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:36.935 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.935 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:36.935 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:36.935 18:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:37.194 [2024-07-24 18:48:22.039276] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.194 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.453 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.453 "name": "Existed_Raid", 00:11:37.453 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:37.453 "strip_size_kb": 64, 00:11:37.453 "state": "configuring", 00:11:37.453 "raid_level": "raid0", 00:11:37.453 "superblock": true, 00:11:37.453 "num_base_bdevs": 3, 00:11:37.453 "num_base_bdevs_discovered": 2, 
00:11:37.453 "num_base_bdevs_operational": 3, 00:11:37.453 "base_bdevs_list": [ 00:11:37.453 { 00:11:37.453 "name": null, 00:11:37.453 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:37.453 "is_configured": false, 00:11:37.453 "data_offset": 2048, 00:11:37.453 "data_size": 63488 00:11:37.453 }, 00:11:37.453 { 00:11:37.453 "name": "BaseBdev2", 00:11:37.453 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:37.453 "is_configured": true, 00:11:37.453 "data_offset": 2048, 00:11:37.453 "data_size": 63488 00:11:37.453 }, 00:11:37.453 { 00:11:37.453 "name": "BaseBdev3", 00:11:37.453 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:37.453 "is_configured": true, 00:11:37.453 "data_offset": 2048, 00:11:37.453 "data_size": 63488 00:11:37.453 } 00:11:37.453 ] 00:11:37.453 }' 00:11:37.453 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.453 18:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:37.712 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.712 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:37.971 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:37.971 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.971 18:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4 00:11:38.230 [2024-07-24 18:48:23.221095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:38.230 [2024-07-24 18:48:23.221224] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xcb1c50 00:11:38.230 [2024-07-24 18:48:23.221232] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:38.230 [2024-07-24 18:48:23.221352] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc9d2d0 00:11:38.230 [2024-07-24 18:48:23.221438] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcb1c50 00:11:38.230 [2024-07-24 18:48:23.221444] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcb1c50 00:11:38.230 [2024-07-24 18:48:23.221515] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:38.230 NewBaseBdev 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:38.230 18:48:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:38.230 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:38.490 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:38.749 [ 00:11:38.749 { 00:11:38.749 "name": "NewBaseBdev", 00:11:38.749 "aliases": [ 00:11:38.749 "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4" 00:11:38.749 ], 00:11:38.749 "product_name": "Malloc disk", 00:11:38.749 "block_size": 512, 00:11:38.749 "num_blocks": 65536, 00:11:38.749 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:38.749 "assigned_rate_limits": { 00:11:38.749 "rw_ios_per_sec": 0, 00:11:38.749 "rw_mbytes_per_sec": 0, 00:11:38.749 "r_mbytes_per_sec": 0, 00:11:38.749 "w_mbytes_per_sec": 0 00:11:38.749 }, 00:11:38.749 "claimed": true, 00:11:38.749 "claim_type": "exclusive_write", 00:11:38.749 "zoned": false, 00:11:38.749 "supported_io_types": { 00:11:38.749 "read": true, 00:11:38.749 "write": true, 00:11:38.749 "unmap": true, 00:11:38.749 "flush": true, 00:11:38.749 "reset": true, 00:11:38.749 "nvme_admin": false, 00:11:38.749 "nvme_io": false, 00:11:38.749 "nvme_io_md": false, 00:11:38.749 "write_zeroes": true, 00:11:38.749 "zcopy": true, 00:11:38.749 "get_zone_info": false, 00:11:38.749 "zone_management": false, 00:11:38.749 "zone_append": false, 00:11:38.749 "compare": false, 00:11:38.749 "compare_and_write": false, 00:11:38.749 "abort": true, 00:11:38.749 "seek_hole": false, 00:11:38.749 "seek_data": false, 00:11:38.749 "copy": true, 00:11:38.749 "nvme_iov_md": false 00:11:38.749 }, 00:11:38.749 "memory_domains": [ 00:11:38.749 { 00:11:38.749 "dma_device_id": "system", 00:11:38.749 "dma_device_type": 1 00:11:38.749 }, 00:11:38.749 { 00:11:38.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.749 "dma_device_type": 2 00:11:38.749 } 00:11:38.749 ], 00:11:38.749 "driver_specific": {} 00:11:38.749 } 00:11:38.749 ] 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.749 "name": "Existed_Raid", 00:11:38.749 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:38.749 "strip_size_kb": 64, 00:11:38.749 "state": "online", 00:11:38.749 "raid_level": "raid0", 00:11:38.749 "superblock": true, 00:11:38.749 "num_base_bdevs": 3, 00:11:38.749 "num_base_bdevs_discovered": 3, 00:11:38.749 "num_base_bdevs_operational": 3, 00:11:38.749 "base_bdevs_list": [ 00:11:38.749 { 00:11:38.749 "name": "NewBaseBdev", 00:11:38.749 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:38.749 "is_configured": true, 00:11:38.749 "data_offset": 2048, 00:11:38.749 "data_size": 63488 00:11:38.749 }, 00:11:38.749 { 00:11:38.749 "name": "BaseBdev2", 00:11:38.749 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:38.749 "is_configured": true, 00:11:38.749 "data_offset": 2048, 00:11:38.749 "data_size": 63488 00:11:38.749 }, 00:11:38.749 { 00:11:38.749 "name": "BaseBdev3", 00:11:38.749 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:38.749 "is_configured": true, 00:11:38.749 "data_offset": 2048, 00:11:38.749 "data_size": 63488 00:11:38.749 } 00:11:38.749 ] 00:11:38.749 }' 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.749 18:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:39.317 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:39.576 [2024-07-24 18:48:24.356279] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:39.576 "name": "Existed_Raid", 00:11:39.576 "aliases": [ 00:11:39.576 "268ca89e-3d14-4492-9316-2e7576f0b8ae" 00:11:39.576 ], 00:11:39.576 "product_name": "Raid Volume", 00:11:39.576 "block_size": 512, 00:11:39.576 "num_blocks": 190464, 00:11:39.576 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:39.576 "assigned_rate_limits": { 00:11:39.576 "rw_ios_per_sec": 0, 00:11:39.576 "rw_mbytes_per_sec": 0, 00:11:39.576 "r_mbytes_per_sec": 0, 00:11:39.576 "w_mbytes_per_sec": 0 00:11:39.576 }, 00:11:39.576 "claimed": false, 00:11:39.576 "zoned": false, 00:11:39.576 "supported_io_types": { 00:11:39.576 "read": true, 
00:11:39.576 "write": true, 00:11:39.576 "unmap": true, 00:11:39.576 "flush": true, 00:11:39.576 "reset": true, 00:11:39.576 "nvme_admin": false, 00:11:39.576 "nvme_io": false, 00:11:39.576 "nvme_io_md": false, 00:11:39.576 "write_zeroes": true, 00:11:39.576 "zcopy": false, 00:11:39.576 "get_zone_info": false, 00:11:39.576 "zone_management": false, 00:11:39.576 "zone_append": false, 00:11:39.576 "compare": false, 00:11:39.576 "compare_and_write": false, 00:11:39.576 "abort": false, 00:11:39.576 "seek_hole": false, 00:11:39.576 "seek_data": false, 00:11:39.576 "copy": false, 00:11:39.576 "nvme_iov_md": false 00:11:39.576 }, 00:11:39.576 "memory_domains": [ 00:11:39.576 { 00:11:39.576 "dma_device_id": "system", 00:11:39.576 "dma_device_type": 1 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.576 "dma_device_type": 2 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "dma_device_id": "system", 00:11:39.576 "dma_device_type": 1 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.576 "dma_device_type": 2 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "dma_device_id": "system", 00:11:39.576 "dma_device_type": 1 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.576 "dma_device_type": 2 00:11:39.576 } 00:11:39.576 ], 00:11:39.576 "driver_specific": { 00:11:39.576 "raid": { 00:11:39.576 "uuid": "268ca89e-3d14-4492-9316-2e7576f0b8ae", 00:11:39.576 "strip_size_kb": 64, 00:11:39.576 "state": "online", 00:11:39.576 "raid_level": "raid0", 00:11:39.576 "superblock": true, 00:11:39.576 "num_base_bdevs": 3, 00:11:39.576 "num_base_bdevs_discovered": 3, 00:11:39.576 "num_base_bdevs_operational": 3, 00:11:39.576 "base_bdevs_list": [ 00:11:39.576 { 00:11:39.576 "name": "NewBaseBdev", 00:11:39.576 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:39.576 "is_configured": true, 00:11:39.576 "data_offset": 2048, 00:11:39.576 "data_size": 63488 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "name": "BaseBdev2", 00:11:39.576 "uuid": "c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:39.576 "is_configured": true, 00:11:39.576 "data_offset": 2048, 00:11:39.576 "data_size": 63488 00:11:39.576 }, 00:11:39.576 { 00:11:39.576 "name": "BaseBdev3", 00:11:39.576 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:39.576 "is_configured": true, 00:11:39.576 "data_offset": 2048, 00:11:39.576 "data_size": 63488 00:11:39.576 } 00:11:39.576 ] 00:11:39.576 } 00:11:39.576 } 00:11:39.576 }' 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:39.576 BaseBdev2 00:11:39.576 BaseBdev3' 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.576 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.576 "name": "NewBaseBdev", 00:11:39.576 "aliases": [ 00:11:39.576 "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4" 00:11:39.576 ], 00:11:39.576 "product_name": 
"Malloc disk", 00:11:39.576 "block_size": 512, 00:11:39.576 "num_blocks": 65536, 00:11:39.576 "uuid": "f37f60ba-2ec1-44c0-8ddb-ee64ae6d96e4", 00:11:39.576 "assigned_rate_limits": { 00:11:39.576 "rw_ios_per_sec": 0, 00:11:39.576 "rw_mbytes_per_sec": 0, 00:11:39.576 "r_mbytes_per_sec": 0, 00:11:39.576 "w_mbytes_per_sec": 0 00:11:39.577 }, 00:11:39.577 "claimed": true, 00:11:39.577 "claim_type": "exclusive_write", 00:11:39.577 "zoned": false, 00:11:39.577 "supported_io_types": { 00:11:39.577 "read": true, 00:11:39.577 "write": true, 00:11:39.577 "unmap": true, 00:11:39.577 "flush": true, 00:11:39.577 "reset": true, 00:11:39.577 "nvme_admin": false, 00:11:39.577 "nvme_io": false, 00:11:39.577 "nvme_io_md": false, 00:11:39.577 "write_zeroes": true, 00:11:39.577 "zcopy": true, 00:11:39.577 "get_zone_info": false, 00:11:39.577 "zone_management": false, 00:11:39.577 "zone_append": false, 00:11:39.577 "compare": false, 00:11:39.577 "compare_and_write": false, 00:11:39.577 "abort": true, 00:11:39.577 "seek_hole": false, 00:11:39.577 "seek_data": false, 00:11:39.577 "copy": true, 00:11:39.577 "nvme_iov_md": false 00:11:39.577 }, 00:11:39.577 "memory_domains": [ 00:11:39.577 { 00:11:39.577 "dma_device_id": "system", 00:11:39.577 "dma_device_type": 1 00:11:39.577 }, 00:11:39.577 { 00:11:39.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.577 "dma_device_type": 2 00:11:39.577 } 00:11:39.577 ], 00:11:39.577 "driver_specific": {} 00:11:39.577 }' 00:11:39.577 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.836 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.094 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:40.094 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:40.094 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:40.094 18:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:40.094 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:40.094 "name": "BaseBdev2", 00:11:40.094 "aliases": [ 00:11:40.094 "c0740e12-6338-4fae-ad0c-24a8c62a7a4d" 00:11:40.094 ], 00:11:40.094 "product_name": "Malloc disk", 00:11:40.094 "block_size": 512, 00:11:40.094 "num_blocks": 65536, 00:11:40.094 "uuid": 
"c0740e12-6338-4fae-ad0c-24a8c62a7a4d", 00:11:40.094 "assigned_rate_limits": { 00:11:40.094 "rw_ios_per_sec": 0, 00:11:40.094 "rw_mbytes_per_sec": 0, 00:11:40.094 "r_mbytes_per_sec": 0, 00:11:40.094 "w_mbytes_per_sec": 0 00:11:40.094 }, 00:11:40.094 "claimed": true, 00:11:40.094 "claim_type": "exclusive_write", 00:11:40.094 "zoned": false, 00:11:40.094 "supported_io_types": { 00:11:40.094 "read": true, 00:11:40.094 "write": true, 00:11:40.094 "unmap": true, 00:11:40.094 "flush": true, 00:11:40.094 "reset": true, 00:11:40.094 "nvme_admin": false, 00:11:40.094 "nvme_io": false, 00:11:40.094 "nvme_io_md": false, 00:11:40.094 "write_zeroes": true, 00:11:40.094 "zcopy": true, 00:11:40.094 "get_zone_info": false, 00:11:40.094 "zone_management": false, 00:11:40.094 "zone_append": false, 00:11:40.094 "compare": false, 00:11:40.094 "compare_and_write": false, 00:11:40.094 "abort": true, 00:11:40.094 "seek_hole": false, 00:11:40.094 "seek_data": false, 00:11:40.094 "copy": true, 00:11:40.094 "nvme_iov_md": false 00:11:40.094 }, 00:11:40.094 "memory_domains": [ 00:11:40.094 { 00:11:40.094 "dma_device_id": "system", 00:11:40.094 "dma_device_type": 1 00:11:40.094 }, 00:11:40.094 { 00:11:40.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.094 "dma_device_type": 2 00:11:40.094 } 00:11:40.094 ], 00:11:40.094 "driver_specific": {} 00:11:40.094 }' 00:11:40.094 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.094 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:40.353 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:40.648 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:40.648 "name": "BaseBdev3", 00:11:40.648 "aliases": [ 00:11:40.649 "254d929e-3fa8-4ab7-b392-86a17736c666" 00:11:40.649 ], 00:11:40.649 "product_name": "Malloc disk", 00:11:40.649 "block_size": 512, 00:11:40.649 "num_blocks": 65536, 00:11:40.649 "uuid": "254d929e-3fa8-4ab7-b392-86a17736c666", 00:11:40.649 "assigned_rate_limits": { 00:11:40.649 "rw_ios_per_sec": 0, 00:11:40.649 
"rw_mbytes_per_sec": 0, 00:11:40.649 "r_mbytes_per_sec": 0, 00:11:40.649 "w_mbytes_per_sec": 0 00:11:40.649 }, 00:11:40.649 "claimed": true, 00:11:40.649 "claim_type": "exclusive_write", 00:11:40.649 "zoned": false, 00:11:40.649 "supported_io_types": { 00:11:40.649 "read": true, 00:11:40.649 "write": true, 00:11:40.649 "unmap": true, 00:11:40.649 "flush": true, 00:11:40.649 "reset": true, 00:11:40.649 "nvme_admin": false, 00:11:40.649 "nvme_io": false, 00:11:40.649 "nvme_io_md": false, 00:11:40.649 "write_zeroes": true, 00:11:40.649 "zcopy": true, 00:11:40.649 "get_zone_info": false, 00:11:40.649 "zone_management": false, 00:11:40.649 "zone_append": false, 00:11:40.649 "compare": false, 00:11:40.649 "compare_and_write": false, 00:11:40.649 "abort": true, 00:11:40.649 "seek_hole": false, 00:11:40.649 "seek_data": false, 00:11:40.649 "copy": true, 00:11:40.649 "nvme_iov_md": false 00:11:40.649 }, 00:11:40.649 "memory_domains": [ 00:11:40.649 { 00:11:40.649 "dma_device_id": "system", 00:11:40.649 "dma_device_type": 1 00:11:40.649 }, 00:11:40.649 { 00:11:40.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.649 "dma_device_type": 2 00:11:40.649 } 00:11:40.649 ], 00:11:40.649 "driver_specific": {} 00:11:40.649 }' 00:11:40.649 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.649 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.649 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:40.649 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.649 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:40.926 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:41.186 [2024-07-24 18:48:25.956222] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:41.186 [2024-07-24 18:48:25.956244] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:41.186 [2024-07-24 18:48:25.956284] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:41.186 [2024-07-24 18:48:25.956319] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:41.186 [2024-07-24 18:48:25.956325] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb1c50 name Existed_Raid, state offline 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2071226 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@948 -- # '[' -z 2071226 ']' 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2071226 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:41.186 18:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2071226 00:11:41.186 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:41.186 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:41.186 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2071226' 00:11:41.186 killing process with pid 2071226 00:11:41.186 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2071226 00:11:41.186 [2024-07-24 18:48:26.020685] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:41.186 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2071226 00:11:41.186 [2024-07-24 18:48:26.044216] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:41.446 18:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:41.446 00:11:41.446 real 0m21.331s 00:11:41.446 user 0m39.763s 00:11:41.446 sys 0m3.320s 00:11:41.446 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:41.446 18:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:41.446 ************************************ 00:11:41.446 END TEST raid_state_function_test_sb 00:11:41.446 ************************************ 00:11:41.446 18:48:26 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:11:41.446 18:48:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:41.446 18:48:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:41.446 18:48:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:41.446 ************************************ 00:11:41.446 START TEST raid_superblock_test 00:11:41.446 ************************************ 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:41.446 
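The raid_superblock_test starting here drives the same rpc.py client against /var/tmp/spdk-raid.sock, but builds each base device as a passthru bdev (pt1, pt2, pt3) layered on a malloc bdev before assembling the raid0 volume, as the trace that follows shows for malloc1/pt1. A minimal sketch of that setup, assuming the same pattern is repeated for all three bases; the fixed-UUID scheme beyond pt1 and the final create call are illustrative, not the exact bdev_raid.sh code:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 1 2 3; do
        # 32 MB malloc bdev with 512-byte blocks (65536 blocks, matching the bdev_get_bdevs dumps above)
        $rpc -s $sock bdev_malloc_create 32 512 -b malloc$i
        # passthru bdev on top, pinned to a fixed UUID so the on-disk superblock contents are predictable
        $rpc -s $sock bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done
    # assemble the striped, superblock-enabled (-s) volume from the passthru bdevs
    $rpc -s $sock bdev_raid_create -z 64 -s -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1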
18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2075650 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2075650 /var/tmp/spdk-raid.sock 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2075650 ']' 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:41.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:41.446 18:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.446 [2024-07-24 18:48:26.336946] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
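(Editor's orientation note, not part of the trace: the raid_superblock_test run that starts here reduces to a short RPC sequence driven against the bdev_svc app listening on /var/tmp/spdk-raid.sock. The grouping below is a sketch reconstructed from the commands visible in this trace, not the test script itself; rpc.py abbreviates /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py, and the loop is illustrative shorthand for the three explicit per-bdev calls the test makes.)

  # Sketch only: build three passthru-wrapped malloc bdevs, assemble them into a
  # raid0 volume that carries a superblock, verify state, then tear down.
  for i in 1 2 3; do
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc$i     # 32 MB, 512 B blocks -> 65536 blocks (matches the dumps below)
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc$i -p pt$i \
           -u 00000000-0000-0000-0000-00000000000$i                             # wrap as pt$i with a fixed UUID
  done
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 \
         -b 'pt1 pt2 pt3' -n raid_bdev1 -s                                      # 64 KiB strips; the resulting bdev reports "superblock": true
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all                     # expect raid_bdev1 with state "online"
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1                 # deconfigure: state goes online -> offline
  for i in 1 2 3; do
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt$i                 # remove the passthru base bdevs
  done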
00:11:41.446 [2024-07-24 18:48:26.336989] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2075650 ] 00:11:41.446 [2024-07-24 18:48:26.401259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:41.705 [2024-07-24 18:48:26.476188] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:41.706 [2024-07-24 18:48:26.532346] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.706 [2024-07-24 18:48:26.532377] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:42.273 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:42.533 malloc1 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:42.533 [2024-07-24 18:48:27.452742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:42.533 [2024-07-24 18:48:27.452781] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.533 [2024-07-24 18:48:27.452792] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1758e20 00:11:42.533 [2024-07-24 18:48:27.452798] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.533 [2024-07-24 18:48:27.453931] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.533 [2024-07-24 18:48:27.453954] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:42.533 pt1 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:42.533 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:42.791 malloc2 00:11:42.791 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:43.050 [2024-07-24 18:48:27.813274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:43.050 [2024-07-24 18:48:27.813305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:43.050 [2024-07-24 18:48:27.813314] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1902ed0 00:11:43.050 [2024-07-24 18:48:27.813319] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:43.050 [2024-07-24 18:48:27.814306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:43.050 [2024-07-24 18:48:27.814328] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:43.050 pt2 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:43.050 18:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:11:43.050 malloc3 00:11:43.050 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:43.309 [2024-07-24 18:48:28.165659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:43.309 [2024-07-24 18:48:28.165687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:43.309 [2024-07-24 18:48:28.165700] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1906a30 00:11:43.309 [2024-07-24 18:48:28.165706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:43.309 [2024-07-24 18:48:28.166590] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:43.309 [2024-07-24 18:48:28.166609] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:43.309 pt3 00:11:43.309 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:43.309 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:43.309 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:11:43.568 [2024-07-24 18:48:28.338128] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:43.568 [2024-07-24 18:48:28.338919] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:43.568 [2024-07-24 18:48:28.338955] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:43.568 [2024-07-24 18:48:28.339053] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1907a40 00:11:43.568 [2024-07-24 18:48:28.339060] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:43.568 [2024-07-24 18:48:28.339183] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1902050 00:11:43.568 [2024-07-24 18:48:28.339271] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1907a40 00:11:43.568 [2024-07-24 18:48:28.339277] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1907a40 00:11:43.568 [2024-07-24 18:48:28.339333] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.568 "name": "raid_bdev1", 00:11:43.568 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:43.568 "strip_size_kb": 64, 00:11:43.568 "state": "online", 00:11:43.568 "raid_level": "raid0", 00:11:43.568 "superblock": true, 00:11:43.568 "num_base_bdevs": 3, 
00:11:43.568 "num_base_bdevs_discovered": 3, 00:11:43.568 "num_base_bdevs_operational": 3, 00:11:43.568 "base_bdevs_list": [ 00:11:43.568 { 00:11:43.568 "name": "pt1", 00:11:43.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:43.568 "is_configured": true, 00:11:43.568 "data_offset": 2048, 00:11:43.568 "data_size": 63488 00:11:43.568 }, 00:11:43.568 { 00:11:43.568 "name": "pt2", 00:11:43.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:43.568 "is_configured": true, 00:11:43.568 "data_offset": 2048, 00:11:43.568 "data_size": 63488 00:11:43.568 }, 00:11:43.568 { 00:11:43.568 "name": "pt3", 00:11:43.568 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:43.568 "is_configured": true, 00:11:43.568 "data_offset": 2048, 00:11:43.568 "data_size": 63488 00:11:43.568 } 00:11:43.568 ] 00:11:43.568 }' 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.568 18:48:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:44.134 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:44.393 [2024-07-24 18:48:29.168434] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:44.393 "name": "raid_bdev1", 00:11:44.393 "aliases": [ 00:11:44.393 "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2" 00:11:44.393 ], 00:11:44.393 "product_name": "Raid Volume", 00:11:44.393 "block_size": 512, 00:11:44.393 "num_blocks": 190464, 00:11:44.393 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:44.393 "assigned_rate_limits": { 00:11:44.393 "rw_ios_per_sec": 0, 00:11:44.393 "rw_mbytes_per_sec": 0, 00:11:44.393 "r_mbytes_per_sec": 0, 00:11:44.393 "w_mbytes_per_sec": 0 00:11:44.393 }, 00:11:44.393 "claimed": false, 00:11:44.393 "zoned": false, 00:11:44.393 "supported_io_types": { 00:11:44.393 "read": true, 00:11:44.393 "write": true, 00:11:44.393 "unmap": true, 00:11:44.393 "flush": true, 00:11:44.393 "reset": true, 00:11:44.393 "nvme_admin": false, 00:11:44.393 "nvme_io": false, 00:11:44.393 "nvme_io_md": false, 00:11:44.393 "write_zeroes": true, 00:11:44.393 "zcopy": false, 00:11:44.393 "get_zone_info": false, 00:11:44.393 "zone_management": false, 00:11:44.393 "zone_append": false, 00:11:44.393 "compare": false, 00:11:44.393 "compare_and_write": false, 00:11:44.393 "abort": false, 00:11:44.393 "seek_hole": false, 00:11:44.393 "seek_data": false, 00:11:44.393 "copy": false, 00:11:44.393 "nvme_iov_md": false 00:11:44.393 }, 00:11:44.393 "memory_domains": [ 00:11:44.393 { 00:11:44.393 "dma_device_id": "system", 00:11:44.393 "dma_device_type": 1 
00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.393 "dma_device_type": 2 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "system", 00:11:44.393 "dma_device_type": 1 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.393 "dma_device_type": 2 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "system", 00:11:44.393 "dma_device_type": 1 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.393 "dma_device_type": 2 00:11:44.393 } 00:11:44.393 ], 00:11:44.393 "driver_specific": { 00:11:44.393 "raid": { 00:11:44.393 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:44.393 "strip_size_kb": 64, 00:11:44.393 "state": "online", 00:11:44.393 "raid_level": "raid0", 00:11:44.393 "superblock": true, 00:11:44.393 "num_base_bdevs": 3, 00:11:44.393 "num_base_bdevs_discovered": 3, 00:11:44.393 "num_base_bdevs_operational": 3, 00:11:44.393 "base_bdevs_list": [ 00:11:44.393 { 00:11:44.393 "name": "pt1", 00:11:44.393 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:44.393 "is_configured": true, 00:11:44.393 "data_offset": 2048, 00:11:44.393 "data_size": 63488 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "name": "pt2", 00:11:44.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.393 "is_configured": true, 00:11:44.393 "data_offset": 2048, 00:11:44.393 "data_size": 63488 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "name": "pt3", 00:11:44.393 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:44.393 "is_configured": true, 00:11:44.393 "data_offset": 2048, 00:11:44.393 "data_size": 63488 00:11:44.393 } 00:11:44.393 ] 00:11:44.393 } 00:11:44.393 } 00:11:44.393 }' 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:44.393 pt2 00:11:44.393 pt3' 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:44.393 "name": "pt1", 00:11:44.393 "aliases": [ 00:11:44.393 "00000000-0000-0000-0000-000000000001" 00:11:44.393 ], 00:11:44.393 "product_name": "passthru", 00:11:44.393 "block_size": 512, 00:11:44.393 "num_blocks": 65536, 00:11:44.393 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:44.393 "assigned_rate_limits": { 00:11:44.393 "rw_ios_per_sec": 0, 00:11:44.393 "rw_mbytes_per_sec": 0, 00:11:44.393 "r_mbytes_per_sec": 0, 00:11:44.393 "w_mbytes_per_sec": 0 00:11:44.393 }, 00:11:44.393 "claimed": true, 00:11:44.393 "claim_type": "exclusive_write", 00:11:44.393 "zoned": false, 00:11:44.393 "supported_io_types": { 00:11:44.393 "read": true, 00:11:44.393 "write": true, 00:11:44.393 "unmap": true, 00:11:44.393 "flush": true, 00:11:44.393 "reset": true, 00:11:44.393 "nvme_admin": false, 00:11:44.393 "nvme_io": false, 00:11:44.393 "nvme_io_md": false, 00:11:44.393 "write_zeroes": true, 00:11:44.393 "zcopy": true, 00:11:44.393 "get_zone_info": false, 00:11:44.393 "zone_management": false, 
00:11:44.393 "zone_append": false, 00:11:44.393 "compare": false, 00:11:44.393 "compare_and_write": false, 00:11:44.393 "abort": true, 00:11:44.393 "seek_hole": false, 00:11:44.393 "seek_data": false, 00:11:44.393 "copy": true, 00:11:44.393 "nvme_iov_md": false 00:11:44.393 }, 00:11:44.393 "memory_domains": [ 00:11:44.393 { 00:11:44.393 "dma_device_id": "system", 00:11:44.393 "dma_device_type": 1 00:11:44.393 }, 00:11:44.393 { 00:11:44.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.393 "dma_device_type": 2 00:11:44.393 } 00:11:44.393 ], 00:11:44.393 "driver_specific": { 00:11:44.393 "passthru": { 00:11:44.393 "name": "pt1", 00:11:44.393 "base_bdev_name": "malloc1" 00:11:44.393 } 00:11:44.393 } 00:11:44.393 }' 00:11:44.393 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.653 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:44.912 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:44.912 "name": "pt2", 00:11:44.912 "aliases": [ 00:11:44.913 "00000000-0000-0000-0000-000000000002" 00:11:44.913 ], 00:11:44.913 "product_name": "passthru", 00:11:44.913 "block_size": 512, 00:11:44.913 "num_blocks": 65536, 00:11:44.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.913 "assigned_rate_limits": { 00:11:44.913 "rw_ios_per_sec": 0, 00:11:44.913 "rw_mbytes_per_sec": 0, 00:11:44.913 "r_mbytes_per_sec": 0, 00:11:44.913 "w_mbytes_per_sec": 0 00:11:44.913 }, 00:11:44.913 "claimed": true, 00:11:44.913 "claim_type": "exclusive_write", 00:11:44.913 "zoned": false, 00:11:44.913 "supported_io_types": { 00:11:44.913 "read": true, 00:11:44.913 "write": true, 00:11:44.913 "unmap": true, 00:11:44.913 "flush": true, 00:11:44.913 "reset": true, 00:11:44.913 "nvme_admin": false, 00:11:44.913 "nvme_io": false, 00:11:44.913 "nvme_io_md": false, 00:11:44.913 "write_zeroes": true, 00:11:44.913 "zcopy": true, 00:11:44.913 "get_zone_info": false, 00:11:44.913 "zone_management": false, 00:11:44.913 "zone_append": false, 00:11:44.913 "compare": false, 00:11:44.913 "compare_and_write": false, 00:11:44.913 "abort": true, 
00:11:44.913 "seek_hole": false, 00:11:44.913 "seek_data": false, 00:11:44.913 "copy": true, 00:11:44.913 "nvme_iov_md": false 00:11:44.913 }, 00:11:44.913 "memory_domains": [ 00:11:44.913 { 00:11:44.913 "dma_device_id": "system", 00:11:44.913 "dma_device_type": 1 00:11:44.913 }, 00:11:44.913 { 00:11:44.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.913 "dma_device_type": 2 00:11:44.913 } 00:11:44.913 ], 00:11:44.913 "driver_specific": { 00:11:44.913 "passthru": { 00:11:44.913 "name": "pt2", 00:11:44.913 "base_bdev_name": "malloc2" 00:11:44.913 } 00:11:44.913 } 00:11:44.913 }' 00:11:44.913 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.913 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.172 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:45.172 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.172 18:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.172 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:45.431 "name": "pt3", 00:11:45.431 "aliases": [ 00:11:45.431 "00000000-0000-0000-0000-000000000003" 00:11:45.431 ], 00:11:45.431 "product_name": "passthru", 00:11:45.431 "block_size": 512, 00:11:45.431 "num_blocks": 65536, 00:11:45.431 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:45.431 "assigned_rate_limits": { 00:11:45.431 "rw_ios_per_sec": 0, 00:11:45.431 "rw_mbytes_per_sec": 0, 00:11:45.431 "r_mbytes_per_sec": 0, 00:11:45.431 "w_mbytes_per_sec": 0 00:11:45.431 }, 00:11:45.431 "claimed": true, 00:11:45.431 "claim_type": "exclusive_write", 00:11:45.431 "zoned": false, 00:11:45.431 "supported_io_types": { 00:11:45.431 "read": true, 00:11:45.431 "write": true, 00:11:45.431 "unmap": true, 00:11:45.431 "flush": true, 00:11:45.431 "reset": true, 00:11:45.431 "nvme_admin": false, 00:11:45.431 "nvme_io": false, 00:11:45.431 "nvme_io_md": false, 00:11:45.431 "write_zeroes": true, 00:11:45.431 "zcopy": true, 00:11:45.431 "get_zone_info": false, 00:11:45.431 "zone_management": false, 00:11:45.431 "zone_append": false, 00:11:45.431 "compare": false, 00:11:45.431 "compare_and_write": false, 00:11:45.431 "abort": true, 00:11:45.431 "seek_hole": false, 00:11:45.431 "seek_data": false, 00:11:45.431 "copy": true, 00:11:45.431 "nvme_iov_md": false 
00:11:45.431 }, 00:11:45.431 "memory_domains": [ 00:11:45.431 { 00:11:45.431 "dma_device_id": "system", 00:11:45.431 "dma_device_type": 1 00:11:45.431 }, 00:11:45.431 { 00:11:45.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.431 "dma_device_type": 2 00:11:45.431 } 00:11:45.431 ], 00:11:45.431 "driver_specific": { 00:11:45.431 "passthru": { 00:11:45.431 "name": "pt3", 00:11:45.431 "base_bdev_name": "malloc3" 00:11:45.431 } 00:11:45.431 } 00:11:45.431 }' 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:45.431 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:45.690 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:45.949 [2024-07-24 18:48:30.844785] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.949 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2 00:11:45.949 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2 ']' 00:11:45.949 18:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:46.208 [2024-07-24 18:48:31.025056] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:46.208 [2024-07-24 18:48:31.025069] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:46.208 [2024-07-24 18:48:31.025108] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:46.208 [2024-07-24 18:48:31.025144] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:46.208 [2024-07-24 18:48:31.025149] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1907a40 name raid_bdev1, state offline 00:11:46.208 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.208 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:46.208 18:48:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:46.208 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:46.208 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:46.208 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:46.467 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:46.467 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:46.726 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:46.726 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:11:46.726 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:46.726 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:46.985 18:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:11:47.244 [2024-07-24 18:48:32.063714] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:47.244 [2024-07-24 18:48:32.064709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:47.244 [2024-07-24 18:48:32.064742] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:11:47.244 [2024-07-24 18:48:32.064775] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:47.244 [2024-07-24 18:48:32.064802] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:47.244 [2024-07-24 18:48:32.064814] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:11:47.244 [2024-07-24 18:48:32.064840] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:47.244 [2024-07-24 18:48:32.064845] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1903100 name raid_bdev1, state configuring 00:11:47.244 request: 00:11:47.244 { 00:11:47.244 "name": "raid_bdev1", 00:11:47.244 "raid_level": "raid0", 00:11:47.244 "base_bdevs": [ 00:11:47.244 "malloc1", 00:11:47.244 "malloc2", 00:11:47.244 "malloc3" 00:11:47.244 ], 00:11:47.244 "strip_size_kb": 64, 00:11:47.244 "superblock": false, 00:11:47.244 "method": "bdev_raid_create", 00:11:47.244 "req_id": 1 00:11:47.244 } 00:11:47.244 Got JSON-RPC error response 00:11:47.244 response: 00:11:47.244 { 00:11:47.244 "code": -17, 00:11:47.244 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:47.244 } 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:47.244 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:47.503 [2024-07-24 18:48:32.412579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:47.503 [2024-07-24 18:48:32.412608] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.503 [2024-07-24 18:48:32.412618] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1903570 00:11:47.503 [2024-07-24 18:48:32.412624] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.503 [2024-07-24 18:48:32.413936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.503 [2024-07-24 18:48:32.413958] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:47.503 [2024-07-24 18:48:32.414004] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:47.503 [2024-07-24 18:48:32.414022] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:47.503 pt1 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:47.503 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.762 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.762 "name": "raid_bdev1", 00:11:47.762 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:47.762 "strip_size_kb": 64, 00:11:47.762 "state": "configuring", 00:11:47.762 "raid_level": "raid0", 00:11:47.762 "superblock": true, 00:11:47.762 "num_base_bdevs": 3, 00:11:47.762 "num_base_bdevs_discovered": 1, 00:11:47.762 "num_base_bdevs_operational": 3, 00:11:47.762 "base_bdevs_list": [ 00:11:47.762 { 00:11:47.762 "name": "pt1", 00:11:47.762 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:47.762 "is_configured": true, 00:11:47.762 "data_offset": 2048, 00:11:47.762 "data_size": 63488 00:11:47.762 }, 00:11:47.762 { 00:11:47.762 "name": null, 00:11:47.762 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:47.762 "is_configured": false, 00:11:47.762 "data_offset": 2048, 00:11:47.762 "data_size": 63488 00:11:47.762 }, 00:11:47.762 { 00:11:47.762 "name": null, 00:11:47.762 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:47.762 "is_configured": false, 00:11:47.762 "data_offset": 2048, 00:11:47.762 "data_size": 63488 00:11:47.762 } 00:11:47.762 ] 00:11:47.762 }' 00:11:47.762 18:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.762 18:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.330 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:11:48.331 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:48.331 [2024-07-24 18:48:33.258784] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:48.331 [2024-07-24 18:48:33.258815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:48.331 [2024-07-24 18:48:33.258828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1759a40 00:11:48.331 [2024-07-24 18:48:33.258836] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:48.331 [2024-07-24 18:48:33.259119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:48.331 [2024-07-24 18:48:33.259132] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:48.331 [2024-07-24 18:48:33.259175] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:48.331 [2024-07-24 18:48:33.259189] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:48.331 pt2 00:11:48.331 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:48.588 [2024-07-24 18:48:33.439261] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.588 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:48.846 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.846 "name": "raid_bdev1", 00:11:48.846 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:48.846 "strip_size_kb": 64, 00:11:48.846 "state": "configuring", 00:11:48.846 "raid_level": "raid0", 00:11:48.846 "superblock": true, 00:11:48.846 "num_base_bdevs": 3, 00:11:48.846 "num_base_bdevs_discovered": 1, 00:11:48.846 "num_base_bdevs_operational": 3, 00:11:48.846 "base_bdevs_list": [ 00:11:48.846 { 00:11:48.846 "name": "pt1", 00:11:48.846 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:48.846 "is_configured": true, 00:11:48.846 "data_offset": 2048, 00:11:48.846 "data_size": 63488 00:11:48.846 }, 00:11:48.846 { 00:11:48.846 "name": null, 00:11:48.846 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:48.846 "is_configured": false, 00:11:48.846 
"data_offset": 2048, 00:11:48.846 "data_size": 63488 00:11:48.846 }, 00:11:48.846 { 00:11:48.846 "name": null, 00:11:48.846 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:48.846 "is_configured": false, 00:11:48.846 "data_offset": 2048, 00:11:48.846 "data_size": 63488 00:11:48.846 } 00:11:48.846 ] 00:11:48.846 }' 00:11:48.846 18:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.846 18:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:49.413 [2024-07-24 18:48:34.265373] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:49.413 [2024-07-24 18:48:34.265408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:49.413 [2024-07-24 18:48:34.265418] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17591f0 00:11:49.413 [2024-07-24 18:48:34.265424] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:49.413 [2024-07-24 18:48:34.265685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:49.413 [2024-07-24 18:48:34.265695] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:49.413 [2024-07-24 18:48:34.265736] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:49.413 [2024-07-24 18:48:34.265748] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:49.413 pt2 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:49.413 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:49.672 [2024-07-24 18:48:34.429795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:49.672 [2024-07-24 18:48:34.429817] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:49.672 [2024-07-24 18:48:34.429825] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19056d0 00:11:49.672 [2024-07-24 18:48:34.429830] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:49.672 [2024-07-24 18:48:34.430042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:49.672 [2024-07-24 18:48:34.430051] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:49.672 [2024-07-24 18:48:34.430085] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:11:49.672 [2024-07-24 18:48:34.430096] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:49.672 [2024-07-24 18:48:34.430163] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x190a0c0 00:11:49.672 [2024-07-24 18:48:34.430168] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:49.672 [2024-07-24 18:48:34.430271] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d4310 00:11:49.672 [2024-07-24 18:48:34.430351] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x190a0c0 00:11:49.672 [2024-07-24 18:48:34.430357] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x190a0c0 00:11:49.672 [2024-07-24 18:48:34.430416] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:49.672 pt3 00:11:49.672 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.673 "name": "raid_bdev1", 00:11:49.673 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:49.673 "strip_size_kb": 64, 00:11:49.673 "state": "online", 00:11:49.673 "raid_level": "raid0", 00:11:49.673 "superblock": true, 00:11:49.673 "num_base_bdevs": 3, 00:11:49.673 "num_base_bdevs_discovered": 3, 00:11:49.673 "num_base_bdevs_operational": 3, 00:11:49.673 "base_bdevs_list": [ 00:11:49.673 { 00:11:49.673 "name": "pt1", 00:11:49.673 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:49.673 "is_configured": true, 00:11:49.673 "data_offset": 2048, 00:11:49.673 "data_size": 63488 00:11:49.673 }, 00:11:49.673 { 00:11:49.673 "name": "pt2", 00:11:49.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:49.673 "is_configured": true, 00:11:49.673 "data_offset": 2048, 00:11:49.673 "data_size": 63488 00:11:49.673 }, 00:11:49.673 { 00:11:49.673 "name": "pt3", 00:11:49.673 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:49.673 "is_configured": true, 00:11:49.673 "data_offset": 2048, 00:11:49.673 "data_size": 63488 00:11:49.673 } 00:11:49.673 ] 00:11:49.673 }' 00:11:49.673 18:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.673 18:48:34 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:50.240 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:50.500 [2024-07-24 18:48:35.260298] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:50.500 "name": "raid_bdev1", 00:11:50.500 "aliases": [ 00:11:50.500 "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2" 00:11:50.500 ], 00:11:50.500 "product_name": "Raid Volume", 00:11:50.500 "block_size": 512, 00:11:50.500 "num_blocks": 190464, 00:11:50.500 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:50.500 "assigned_rate_limits": { 00:11:50.500 "rw_ios_per_sec": 0, 00:11:50.500 "rw_mbytes_per_sec": 0, 00:11:50.500 "r_mbytes_per_sec": 0, 00:11:50.500 "w_mbytes_per_sec": 0 00:11:50.500 }, 00:11:50.500 "claimed": false, 00:11:50.500 "zoned": false, 00:11:50.500 "supported_io_types": { 00:11:50.500 "read": true, 00:11:50.500 "write": true, 00:11:50.500 "unmap": true, 00:11:50.500 "flush": true, 00:11:50.500 "reset": true, 00:11:50.500 "nvme_admin": false, 00:11:50.500 "nvme_io": false, 00:11:50.500 "nvme_io_md": false, 00:11:50.500 "write_zeroes": true, 00:11:50.500 "zcopy": false, 00:11:50.500 "get_zone_info": false, 00:11:50.500 "zone_management": false, 00:11:50.500 "zone_append": false, 00:11:50.500 "compare": false, 00:11:50.500 "compare_and_write": false, 00:11:50.500 "abort": false, 00:11:50.500 "seek_hole": false, 00:11:50.500 "seek_data": false, 00:11:50.500 "copy": false, 00:11:50.500 "nvme_iov_md": false 00:11:50.500 }, 00:11:50.500 "memory_domains": [ 00:11:50.500 { 00:11:50.500 "dma_device_id": "system", 00:11:50.500 "dma_device_type": 1 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.500 "dma_device_type": 2 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "dma_device_id": "system", 00:11:50.500 "dma_device_type": 1 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.500 "dma_device_type": 2 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "dma_device_id": "system", 00:11:50.500 "dma_device_type": 1 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.500 "dma_device_type": 2 00:11:50.500 } 00:11:50.500 ], 00:11:50.500 "driver_specific": { 00:11:50.500 "raid": { 00:11:50.500 "uuid": "b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2", 00:11:50.500 "strip_size_kb": 64, 00:11:50.500 "state": "online", 00:11:50.500 "raid_level": "raid0", 00:11:50.500 "superblock": true, 00:11:50.500 "num_base_bdevs": 3, 00:11:50.500 "num_base_bdevs_discovered": 3, 
00:11:50.500 "num_base_bdevs_operational": 3, 00:11:50.500 "base_bdevs_list": [ 00:11:50.500 { 00:11:50.500 "name": "pt1", 00:11:50.500 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:50.500 "is_configured": true, 00:11:50.500 "data_offset": 2048, 00:11:50.500 "data_size": 63488 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "name": "pt2", 00:11:50.500 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:50.500 "is_configured": true, 00:11:50.500 "data_offset": 2048, 00:11:50.500 "data_size": 63488 00:11:50.500 }, 00:11:50.500 { 00:11:50.500 "name": "pt3", 00:11:50.500 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:50.500 "is_configured": true, 00:11:50.500 "data_offset": 2048, 00:11:50.500 "data_size": 63488 00:11:50.500 } 00:11:50.500 ] 00:11:50.500 } 00:11:50.500 } 00:11:50.500 }' 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:50.500 pt2 00:11:50.500 pt3' 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:50.500 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:50.500 "name": "pt1", 00:11:50.500 "aliases": [ 00:11:50.500 "00000000-0000-0000-0000-000000000001" 00:11:50.500 ], 00:11:50.500 "product_name": "passthru", 00:11:50.500 "block_size": 512, 00:11:50.500 "num_blocks": 65536, 00:11:50.500 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:50.500 "assigned_rate_limits": { 00:11:50.500 "rw_ios_per_sec": 0, 00:11:50.500 "rw_mbytes_per_sec": 0, 00:11:50.500 "r_mbytes_per_sec": 0, 00:11:50.500 "w_mbytes_per_sec": 0 00:11:50.500 }, 00:11:50.500 "claimed": true, 00:11:50.500 "claim_type": "exclusive_write", 00:11:50.500 "zoned": false, 00:11:50.500 "supported_io_types": { 00:11:50.500 "read": true, 00:11:50.500 "write": true, 00:11:50.500 "unmap": true, 00:11:50.500 "flush": true, 00:11:50.500 "reset": true, 00:11:50.500 "nvme_admin": false, 00:11:50.500 "nvme_io": false, 00:11:50.500 "nvme_io_md": false, 00:11:50.500 "write_zeroes": true, 00:11:50.500 "zcopy": true, 00:11:50.500 "get_zone_info": false, 00:11:50.500 "zone_management": false, 00:11:50.500 "zone_append": false, 00:11:50.500 "compare": false, 00:11:50.500 "compare_and_write": false, 00:11:50.500 "abort": true, 00:11:50.500 "seek_hole": false, 00:11:50.500 "seek_data": false, 00:11:50.501 "copy": true, 00:11:50.501 "nvme_iov_md": false 00:11:50.501 }, 00:11:50.501 "memory_domains": [ 00:11:50.501 { 00:11:50.501 "dma_device_id": "system", 00:11:50.501 "dma_device_type": 1 00:11:50.501 }, 00:11:50.501 { 00:11:50.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.501 "dma_device_type": 2 00:11:50.501 } 00:11:50.501 ], 00:11:50.501 "driver_specific": { 00:11:50.501 "passthru": { 00:11:50.501 "name": "pt1", 00:11:50.501 "base_bdev_name": "malloc1" 00:11:50.501 } 00:11:50.501 } 00:11:50.501 }' 00:11:50.501 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:50.760 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.019 "name": "pt2", 00:11:51.019 "aliases": [ 00:11:51.019 "00000000-0000-0000-0000-000000000002" 00:11:51.019 ], 00:11:51.019 "product_name": "passthru", 00:11:51.019 "block_size": 512, 00:11:51.019 "num_blocks": 65536, 00:11:51.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:51.019 "assigned_rate_limits": { 00:11:51.019 "rw_ios_per_sec": 0, 00:11:51.019 "rw_mbytes_per_sec": 0, 00:11:51.019 "r_mbytes_per_sec": 0, 00:11:51.019 "w_mbytes_per_sec": 0 00:11:51.019 }, 00:11:51.019 "claimed": true, 00:11:51.019 "claim_type": "exclusive_write", 00:11:51.019 "zoned": false, 00:11:51.019 "supported_io_types": { 00:11:51.019 "read": true, 00:11:51.019 "write": true, 00:11:51.019 "unmap": true, 00:11:51.019 "flush": true, 00:11:51.019 "reset": true, 00:11:51.019 "nvme_admin": false, 00:11:51.019 "nvme_io": false, 00:11:51.019 "nvme_io_md": false, 00:11:51.019 "write_zeroes": true, 00:11:51.019 "zcopy": true, 00:11:51.019 "get_zone_info": false, 00:11:51.019 "zone_management": false, 00:11:51.019 "zone_append": false, 00:11:51.019 "compare": false, 00:11:51.019 "compare_and_write": false, 00:11:51.019 "abort": true, 00:11:51.019 "seek_hole": false, 00:11:51.019 "seek_data": false, 00:11:51.019 "copy": true, 00:11:51.019 "nvme_iov_md": false 00:11:51.019 }, 00:11:51.019 "memory_domains": [ 00:11:51.019 { 00:11:51.019 "dma_device_id": "system", 00:11:51.019 "dma_device_type": 1 00:11:51.019 }, 00:11:51.019 { 00:11:51.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.019 "dma_device_type": 2 00:11:51.019 } 00:11:51.019 ], 00:11:51.019 "driver_specific": { 00:11:51.019 "passthru": { 00:11:51.019 "name": "pt2", 00:11:51.019 "base_bdev_name": "malloc2" 00:11:51.019 } 00:11:51.019 } 00:11:51.019 }' 00:11:51.019 18:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.019 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.277 18:48:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:51.277 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.536 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.536 "name": "pt3", 00:11:51.536 "aliases": [ 00:11:51.536 "00000000-0000-0000-0000-000000000003" 00:11:51.536 ], 00:11:51.536 "product_name": "passthru", 00:11:51.536 "block_size": 512, 00:11:51.536 "num_blocks": 65536, 00:11:51.536 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:51.536 "assigned_rate_limits": { 00:11:51.536 "rw_ios_per_sec": 0, 00:11:51.536 "rw_mbytes_per_sec": 0, 00:11:51.536 "r_mbytes_per_sec": 0, 00:11:51.536 "w_mbytes_per_sec": 0 00:11:51.536 }, 00:11:51.536 "claimed": true, 00:11:51.536 "claim_type": "exclusive_write", 00:11:51.536 "zoned": false, 00:11:51.536 "supported_io_types": { 00:11:51.536 "read": true, 00:11:51.536 "write": true, 00:11:51.536 "unmap": true, 00:11:51.536 "flush": true, 00:11:51.536 "reset": true, 00:11:51.536 "nvme_admin": false, 00:11:51.536 "nvme_io": false, 00:11:51.536 "nvme_io_md": false, 00:11:51.536 "write_zeroes": true, 00:11:51.536 "zcopy": true, 00:11:51.536 "get_zone_info": false, 00:11:51.536 "zone_management": false, 00:11:51.536 "zone_append": false, 00:11:51.536 "compare": false, 00:11:51.536 "compare_and_write": false, 00:11:51.536 "abort": true, 00:11:51.536 "seek_hole": false, 00:11:51.536 "seek_data": false, 00:11:51.536 "copy": true, 00:11:51.536 "nvme_iov_md": false 00:11:51.536 }, 00:11:51.536 "memory_domains": [ 00:11:51.536 { 00:11:51.536 "dma_device_id": "system", 00:11:51.536 "dma_device_type": 1 00:11:51.536 }, 00:11:51.536 { 00:11:51.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.536 "dma_device_type": 2 00:11:51.536 } 00:11:51.536 ], 00:11:51.536 "driver_specific": { 00:11:51.536 "passthru": { 00:11:51.536 "name": "pt3", 00:11:51.536 "base_bdev_name": "malloc3" 00:11:51.536 } 00:11:51.536 } 00:11:51.536 }' 00:11:51.536 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.536 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.536 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.536 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:51.795 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:52.054 [2024-07-24 18:48:36.916569] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2 '!=' b49c96bf-0e0b-4c71-9a4b-2046e42f4bc2 ']' 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2075650 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2075650 ']' 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2075650 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:52.054 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2075650 00:11:52.055 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:52.055 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:52.055 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2075650' 00:11:52.055 killing process with pid 2075650 00:11:52.055 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2075650 00:11:52.055 [2024-07-24 18:48:36.967771] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:52.055 [2024-07-24 18:48:36.967812] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:52.055 18:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2075650 00:11:52.055 [2024-07-24 18:48:36.967849] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:52.055 [2024-07-24 18:48:36.967855] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x190a0c0 name raid_bdev1, state offline 00:11:52.055 [2024-07-24 18:48:36.991211] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:52.314 18:48:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:52.314 00:11:52.314 real 0m10.881s 00:11:52.314 user 0m19.794s 00:11:52.314 sys 0m1.693s 00:11:52.314 18:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:52.314 18:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.314 ************************************ 00:11:52.314 END TEST raid_superblock_test 00:11:52.314 ************************************ 00:11:52.314 18:48:37 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:11:52.314 18:48:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:52.314 18:48:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:52.314 18:48:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:52.314 ************************************ 00:11:52.314 START TEST raid_read_error_test 00:11:52.314 ************************************ 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:52.314 18:48:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.M2hUakLmmC 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2077768 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2077768 /var/tmp/spdk-raid.sock 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2077768 ']' 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:52.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:52.314 18:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.314 [2024-07-24 18:48:37.294996] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:11:52.314 [2024-07-24 18:48:37.295035] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2077768 ] 00:11:52.573 [2024-07-24 18:48:37.353757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:52.573 [2024-07-24 18:48:37.433041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:52.573 [2024-07-24 18:48:37.490171] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:52.573 [2024-07-24 18:48:37.490198] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:53.140 18:48:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:53.140 18:48:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:53.140 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:53.140 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:53.399 BaseBdev1_malloc 00:11:53.399 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:53.657 true 00:11:53.657 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:53.657 [2024-07-24 
18:48:38.570033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:53.658 [2024-07-24 18:48:38.570063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:53.658 [2024-07-24 18:48:38.570073] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x258fd20 00:11:53.658 [2024-07-24 18:48:38.570079] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:53.658 [2024-07-24 18:48:38.571238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:53.658 [2024-07-24 18:48:38.571258] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:53.658 BaseBdev1 00:11:53.658 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:53.658 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:53.917 BaseBdev2_malloc 00:11:53.917 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:53.917 true 00:11:53.917 18:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:54.175 [2024-07-24 18:48:39.074973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:54.175 [2024-07-24 18:48:39.075006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:54.175 [2024-07-24 18:48:39.075017] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2594d50 00:11:54.175 [2024-07-24 18:48:39.075023] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:54.175 [2024-07-24 18:48:39.076093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:54.175 [2024-07-24 18:48:39.076114] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:54.175 BaseBdev2 00:11:54.175 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:54.175 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:11:54.434 BaseBdev3_malloc 00:11:54.434 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:11:54.434 true 00:11:54.434 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:11:54.692 [2024-07-24 18:48:39.575816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:11:54.692 [2024-07-24 18:48:39.575850] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:54.692 [2024-07-24 18:48:39.575861] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2593ef0 00:11:54.692 [2024-07-24 18:48:39.575868] 
vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:54.692 [2024-07-24 18:48:39.576949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:54.692 [2024-07-24 18:48:39.576969] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:11:54.692 BaseBdev3 00:11:54.692 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:11:54.952 [2024-07-24 18:48:39.744275] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:54.952 [2024-07-24 18:48:39.745185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:54.952 [2024-07-24 18:48:39.745231] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:54.952 [2024-07-24 18:48:39.745368] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2597a00 00:11:54.952 [2024-07-24 18:48:39.745374] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:54.952 [2024-07-24 18:48:39.745519] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23eb750 00:11:54.952 [2024-07-24 18:48:39.745625] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2597a00 00:11:54.952 [2024-07-24 18:48:39.745630] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2597a00 00:11:54.952 [2024-07-24 18:48:39.745699] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:54.952 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.952 "name": "raid_bdev1", 00:11:54.952 "uuid": "278de501-1f5e-427e-98a1-500563159847", 00:11:54.952 "strip_size_kb": 64, 00:11:54.952 "state": "online", 00:11:54.952 "raid_level": "raid0", 00:11:54.952 "superblock": true, 00:11:54.952 "num_base_bdevs": 3, 00:11:54.952 
"num_base_bdevs_discovered": 3, 00:11:54.952 "num_base_bdevs_operational": 3, 00:11:54.952 "base_bdevs_list": [ 00:11:54.952 { 00:11:54.952 "name": "BaseBdev1", 00:11:54.952 "uuid": "f11d47d3-7a88-57c8-a6bb-b63097c649df", 00:11:54.952 "is_configured": true, 00:11:54.952 "data_offset": 2048, 00:11:54.952 "data_size": 63488 00:11:54.952 }, 00:11:54.952 { 00:11:54.952 "name": "BaseBdev2", 00:11:54.952 "uuid": "0f67646d-3e4d-5107-aaf9-b7084a59f56c", 00:11:54.952 "is_configured": true, 00:11:54.952 "data_offset": 2048, 00:11:54.952 "data_size": 63488 00:11:54.952 }, 00:11:54.952 { 00:11:54.952 "name": "BaseBdev3", 00:11:54.952 "uuid": "6eb8a23a-c120-5469-aa92-d7a2ae33568b", 00:11:54.952 "is_configured": true, 00:11:54.952 "data_offset": 2048, 00:11:54.952 "data_size": 63488 00:11:54.952 } 00:11:54.952 ] 00:11:54.953 }' 00:11:54.953 18:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.953 18:48:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.520 18:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:55.520 18:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:55.520 [2024-07-24 18:48:40.502513] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2597930 00:11:56.457 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.716 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:56.975 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:56.975 "name": 
"raid_bdev1", 00:11:56.975 "uuid": "278de501-1f5e-427e-98a1-500563159847", 00:11:56.975 "strip_size_kb": 64, 00:11:56.975 "state": "online", 00:11:56.975 "raid_level": "raid0", 00:11:56.975 "superblock": true, 00:11:56.975 "num_base_bdevs": 3, 00:11:56.975 "num_base_bdevs_discovered": 3, 00:11:56.975 "num_base_bdevs_operational": 3, 00:11:56.975 "base_bdevs_list": [ 00:11:56.975 { 00:11:56.975 "name": "BaseBdev1", 00:11:56.975 "uuid": "f11d47d3-7a88-57c8-a6bb-b63097c649df", 00:11:56.975 "is_configured": true, 00:11:56.975 "data_offset": 2048, 00:11:56.975 "data_size": 63488 00:11:56.975 }, 00:11:56.975 { 00:11:56.975 "name": "BaseBdev2", 00:11:56.975 "uuid": "0f67646d-3e4d-5107-aaf9-b7084a59f56c", 00:11:56.975 "is_configured": true, 00:11:56.975 "data_offset": 2048, 00:11:56.975 "data_size": 63488 00:11:56.975 }, 00:11:56.975 { 00:11:56.975 "name": "BaseBdev3", 00:11:56.975 "uuid": "6eb8a23a-c120-5469-aa92-d7a2ae33568b", 00:11:56.975 "is_configured": true, 00:11:56.975 "data_offset": 2048, 00:11:56.975 "data_size": 63488 00:11:56.975 } 00:11:56.975 ] 00:11:56.975 }' 00:11:56.975 18:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:56.975 18:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:57.543 [2024-07-24 18:48:42.435612] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:57.543 [2024-07-24 18:48:42.435638] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:57.543 [2024-07-24 18:48:42.437727] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:57.543 [2024-07-24 18:48:42.437750] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.543 [2024-07-24 18:48:42.437771] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:57.543 [2024-07-24 18:48:42.437777] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2597a00 name raid_bdev1, state offline 00:11:57.543 0 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2077768 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2077768 ']' 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2077768 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2077768 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2077768' 00:11:57.543 killing process with pid 2077768 00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2077768 00:11:57.543 [2024-07-24 18:48:42.495765] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:11:57.543 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2077768 00:11:57.543 [2024-07-24 18:48:42.513993] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.M2hUakLmmC 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:11:57.802 00:11:57.802 real 0m5.467s 00:11:57.802 user 0m8.512s 00:11:57.802 sys 0m0.785s 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:57.802 18:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.802 ************************************ 00:11:57.802 END TEST raid_read_error_test 00:11:57.802 ************************************ 00:11:57.802 18:48:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:11:57.802 18:48:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:57.802 18:48:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:57.802 18:48:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:57.802 ************************************ 00:11:57.802 START TEST raid_write_error_test 00:11:57.802 ************************************ 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.hG28JiLkRT 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2078785 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2078785 /var/tmp/spdk-raid.sock 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2078785 ']' 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:57.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:57.802 18:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.061 [2024-07-24 18:48:42.834222] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:11:58.061 [2024-07-24 18:48:42.834263] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2078785 ] 00:11:58.061 [2024-07-24 18:48:42.897119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:58.061 [2024-07-24 18:48:42.978354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:58.061 [2024-07-24 18:48:43.037521] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.061 [2024-07-24 18:48:43.037551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.628 18:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.628 18:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:58.628 18:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:58.628 18:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:58.886 BaseBdev1_malloc 00:11:58.886 18:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:59.145 true 00:11:59.145 18:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:59.145 [2024-07-24 18:48:44.090716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:59.145 [2024-07-24 18:48:44.090747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:59.145 [2024-07-24 18:48:44.090759] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b24d20 00:11:59.145 [2024-07-24 18:48:44.090764] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:59.145 [2024-07-24 18:48:44.091969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:59.145 [2024-07-24 18:48:44.091988] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:59.145 BaseBdev1 00:11:59.145 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:59.145 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:59.405 BaseBdev2_malloc 00:11:59.405 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:59.685 true 00:11:59.685 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:59.685 [2024-07-24 18:48:44.627681] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:59.685 [2024-07-24 18:48:44.627713] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:59.685 [2024-07-24 18:48:44.627728] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b29d50 00:11:59.685 [2024-07-24 18:48:44.627749] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:59.685 [2024-07-24 18:48:44.628825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:59.685 [2024-07-24 18:48:44.628845] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:59.685 BaseBdev2 00:11:59.685 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:59.685 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:00.015 BaseBdev3_malloc 00:12:00.015 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:00.015 true 00:12:00.015 18:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:00.274 [2024-07-24 18:48:45.116472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:00.274 [2024-07-24 18:48:45.116503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:00.274 [2024-07-24 18:48:45.116513] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b28ef0 00:12:00.274 [2024-07-24 18:48:45.116519] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:00.274 [2024-07-24 18:48:45.117543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:00.274 [2024-07-24 18:48:45.117563] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:00.274 BaseBdev3 00:12:00.274 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:00.534 [2024-07-24 18:48:45.284936] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:00.534 [2024-07-24 18:48:45.285861] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:00.534 [2024-07-24 18:48:45.285909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:00.534 [2024-07-24 18:48:45.286050] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b2ca00 00:12:00.534 [2024-07-24 18:48:45.286057] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:00.534 [2024-07-24 18:48:45.286198] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1980750 00:12:00.534 [2024-07-24 18:48:45.286303] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b2ca00 00:12:00.534 [2024-07-24 18:48:45.286309] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b2ca00 00:12:00.534 [2024-07-24 18:48:45.286378] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.534 "name": "raid_bdev1", 00:12:00.534 "uuid": "b799bc67-9a2e-4e2f-bf6d-2e0157399d19", 00:12:00.534 "strip_size_kb": 64, 00:12:00.534 "state": "online", 00:12:00.534 "raid_level": "raid0", 00:12:00.534 "superblock": true, 00:12:00.534 "num_base_bdevs": 3, 00:12:00.534 "num_base_bdevs_discovered": 3, 00:12:00.534 "num_base_bdevs_operational": 3, 00:12:00.534 "base_bdevs_list": [ 00:12:00.534 { 00:12:00.534 "name": "BaseBdev1", 00:12:00.534 "uuid": "ae5a2c1e-cec2-5cbe-bd64-1b2cd119da9f", 00:12:00.534 "is_configured": true, 00:12:00.534 "data_offset": 2048, 00:12:00.534 "data_size": 63488 00:12:00.534 }, 00:12:00.534 { 00:12:00.534 "name": "BaseBdev2", 00:12:00.534 "uuid": "a9feb4e2-ce19-5007-aa8a-40e9b05df111", 00:12:00.534 "is_configured": true, 00:12:00.534 "data_offset": 2048, 00:12:00.534 "data_size": 63488 00:12:00.534 }, 00:12:00.534 { 00:12:00.534 "name": "BaseBdev3", 00:12:00.534 "uuid": "a38802d2-b7c6-5657-9995-ad55838fd997", 00:12:00.534 "is_configured": true, 00:12:00.534 "data_offset": 2048, 00:12:00.534 "data_size": 63488 00:12:00.534 } 00:12:00.534 ] 00:12:00.534 }' 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.534 18:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.101 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:01.101 18:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:01.101 [2024-07-24 18:48:46.023050] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b2c930 00:12:02.037 18:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.295 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:02.553 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.553 "name": "raid_bdev1", 00:12:02.553 "uuid": "b799bc67-9a2e-4e2f-bf6d-2e0157399d19", 00:12:02.553 "strip_size_kb": 64, 00:12:02.553 "state": "online", 00:12:02.553 "raid_level": "raid0", 00:12:02.553 "superblock": true, 00:12:02.553 "num_base_bdevs": 3, 00:12:02.553 "num_base_bdevs_discovered": 3, 00:12:02.553 "num_base_bdevs_operational": 3, 00:12:02.553 "base_bdevs_list": [ 00:12:02.553 { 00:12:02.553 "name": "BaseBdev1", 00:12:02.553 "uuid": "ae5a2c1e-cec2-5cbe-bd64-1b2cd119da9f", 00:12:02.553 "is_configured": true, 00:12:02.553 "data_offset": 2048, 00:12:02.553 "data_size": 63488 00:12:02.553 }, 00:12:02.553 { 00:12:02.553 "name": "BaseBdev2", 00:12:02.553 "uuid": "a9feb4e2-ce19-5007-aa8a-40e9b05df111", 00:12:02.553 "is_configured": true, 00:12:02.553 "data_offset": 2048, 00:12:02.554 "data_size": 63488 00:12:02.554 }, 00:12:02.554 { 00:12:02.554 "name": "BaseBdev3", 00:12:02.554 "uuid": "a38802d2-b7c6-5657-9995-ad55838fd997", 00:12:02.554 "is_configured": true, 00:12:02.554 "data_offset": 2048, 00:12:02.554 "data_size": 63488 00:12:02.554 } 00:12:02.554 ] 00:12:02.554 }' 00:12:02.554 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.554 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.812 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:03.071 [2024-07-24 18:48:47.931454] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:03.071 [2024-07-24 18:48:47.931510] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:03.071 [2024-07-24 18:48:47.933574] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:03.071 [2024-07-24 18:48:47.933598] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:03.071 [2024-07-24 18:48:47.933619] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:03.071 [2024-07-24 18:48:47.933624] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b2ca00 name raid_bdev1, state offline 00:12:03.071 0 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2078785 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2078785 ']' 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2078785 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2078785 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2078785' 00:12:03.071 killing process with pid 2078785 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2078785 00:12:03.071 [2024-07-24 18:48:47.993544] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:03.071 18:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2078785 00:12:03.071 [2024-07-24 18:48:48.011230] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:03.330 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.hG28JiLkRT 00:12:03.330 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:12:03.331 00:12:03.331 real 0m5.430s 00:12:03.331 user 0m8.421s 00:12:03.331 sys 0m0.798s 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:03.331 18:48:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.331 ************************************ 00:12:03.331 END TEST raid_write_error_test 00:12:03.331 ************************************ 00:12:03.331 18:48:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:03.331 18:48:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 
00:12:03.331 18:48:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:03.331 18:48:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:03.331 18:48:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:03.331 ************************************ 00:12:03.331 START TEST raid_state_function_test 00:12:03.331 ************************************ 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2079786 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # 
echo 'Process raid pid: 2079786' 00:12:03.331 Process raid pid: 2079786 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2079786 /var/tmp/spdk-raid.sock 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2079786 ']' 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:03.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:03.331 18:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.331 [2024-07-24 18:48:48.325921] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:12:03.331 [2024-07-24 18:48:48.325965] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:03.590 [2024-07-24 18:48:48.389477] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:03.590 [2024-07-24 18:48:48.461216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.590 [2024-07-24 18:48:48.512247] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.590 [2024-07-24 18:48:48.512274] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.158 18:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:04.158 18:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:04.158 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:04.416 [2024-07-24 18:48:49.263235] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.416 [2024-07-24 18:48:49.263261] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.416 [2024-07-24 18:48:49.263266] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.416 [2024-07-24 18:48:49.263271] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.416 [2024-07-24 18:48:49.263277] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:04.416 [2024-07-24 18:48:49.263282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:04.416 18:48:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.416 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.675 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.675 "name": "Existed_Raid", 00:12:04.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.675 "strip_size_kb": 64, 00:12:04.675 "state": "configuring", 00:12:04.676 "raid_level": "concat", 00:12:04.676 "superblock": false, 00:12:04.676 "num_base_bdevs": 3, 00:12:04.676 "num_base_bdevs_discovered": 0, 00:12:04.676 "num_base_bdevs_operational": 3, 00:12:04.676 "base_bdevs_list": [ 00:12:04.676 { 00:12:04.676 "name": "BaseBdev1", 00:12:04.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.676 "is_configured": false, 00:12:04.676 "data_offset": 0, 00:12:04.676 "data_size": 0 00:12:04.676 }, 00:12:04.676 { 00:12:04.676 "name": "BaseBdev2", 00:12:04.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.676 "is_configured": false, 00:12:04.676 "data_offset": 0, 00:12:04.676 "data_size": 0 00:12:04.676 }, 00:12:04.676 { 00:12:04.676 "name": "BaseBdev3", 00:12:04.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.676 "is_configured": false, 00:12:04.676 "data_offset": 0, 00:12:04.676 "data_size": 0 00:12:04.676 } 00:12:04.676 ] 00:12:04.676 }' 00:12:04.676 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.676 18:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.240 18:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:05.240 [2024-07-24 18:48:50.117369] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:05.240 [2024-07-24 18:48:50.117394] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc68ba0 name Existed_Raid, state configuring 00:12:05.240 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:05.499 [2024-07-24 18:48:50.285814] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev1 00:12:05.499 [2024-07-24 18:48:50.285830] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:05.499 [2024-07-24 18:48:50.285834] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:05.499 [2024-07-24 18:48:50.285839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:05.499 [2024-07-24 18:48:50.285843] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:05.499 [2024-07-24 18:48:50.285847] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:05.499 [2024-07-24 18:48:50.466604] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:05.499 BaseBdev1 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:05.499 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:05.757 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:06.029 [ 00:12:06.029 { 00:12:06.029 "name": "BaseBdev1", 00:12:06.029 "aliases": [ 00:12:06.029 "8f270669-a480-4aca-8fbd-73cd1190963c" 00:12:06.029 ], 00:12:06.029 "product_name": "Malloc disk", 00:12:06.029 "block_size": 512, 00:12:06.029 "num_blocks": 65536, 00:12:06.029 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:06.029 "assigned_rate_limits": { 00:12:06.029 "rw_ios_per_sec": 0, 00:12:06.029 "rw_mbytes_per_sec": 0, 00:12:06.029 "r_mbytes_per_sec": 0, 00:12:06.029 "w_mbytes_per_sec": 0 00:12:06.029 }, 00:12:06.029 "claimed": true, 00:12:06.029 "claim_type": "exclusive_write", 00:12:06.029 "zoned": false, 00:12:06.029 "supported_io_types": { 00:12:06.029 "read": true, 00:12:06.029 "write": true, 00:12:06.029 "unmap": true, 00:12:06.029 "flush": true, 00:12:06.029 "reset": true, 00:12:06.029 "nvme_admin": false, 00:12:06.029 "nvme_io": false, 00:12:06.029 "nvme_io_md": false, 00:12:06.029 "write_zeroes": true, 00:12:06.029 "zcopy": true, 00:12:06.029 "get_zone_info": false, 00:12:06.029 "zone_management": false, 00:12:06.029 "zone_append": false, 00:12:06.029 "compare": false, 00:12:06.029 "compare_and_write": false, 00:12:06.029 "abort": true, 00:12:06.029 "seek_hole": false, 00:12:06.029 "seek_data": false, 00:12:06.029 "copy": true, 00:12:06.029 "nvme_iov_md": false 00:12:06.029 }, 00:12:06.029 "memory_domains": [ 00:12:06.029 { 00:12:06.029 
"dma_device_id": "system", 00:12:06.029 "dma_device_type": 1 00:12:06.029 }, 00:12:06.029 { 00:12:06.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.029 "dma_device_type": 2 00:12:06.029 } 00:12:06.029 ], 00:12:06.029 "driver_specific": {} 00:12:06.029 } 00:12:06.029 ] 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.029 "name": "Existed_Raid", 00:12:06.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.029 "strip_size_kb": 64, 00:12:06.029 "state": "configuring", 00:12:06.029 "raid_level": "concat", 00:12:06.029 "superblock": false, 00:12:06.029 "num_base_bdevs": 3, 00:12:06.029 "num_base_bdevs_discovered": 1, 00:12:06.029 "num_base_bdevs_operational": 3, 00:12:06.029 "base_bdevs_list": [ 00:12:06.029 { 00:12:06.029 "name": "BaseBdev1", 00:12:06.029 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:06.029 "is_configured": true, 00:12:06.029 "data_offset": 0, 00:12:06.029 "data_size": 65536 00:12:06.029 }, 00:12:06.029 { 00:12:06.029 "name": "BaseBdev2", 00:12:06.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.029 "is_configured": false, 00:12:06.029 "data_offset": 0, 00:12:06.029 "data_size": 0 00:12:06.029 }, 00:12:06.029 { 00:12:06.029 "name": "BaseBdev3", 00:12:06.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.029 "is_configured": false, 00:12:06.029 "data_offset": 0, 00:12:06.029 "data_size": 0 00:12:06.029 } 00:12:06.029 ] 00:12:06.029 }' 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.029 18:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.595 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:06.854 [2024-07-24 18:48:51.629603] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:06.854 [2024-07-24 18:48:51.629632] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc68470 name Existed_Raid, state configuring 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:06.854 [2024-07-24 18:48:51.802065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:06.854 [2024-07-24 18:48:51.803057] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:06.854 [2024-07-24 18:48:51.803083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:06.854 [2024-07-24 18:48:51.803087] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:06.854 [2024-07-24 18:48:51.803092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.854 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.112 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.112 "name": "Existed_Raid", 00:12:07.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.112 "strip_size_kb": 64, 00:12:07.112 "state": "configuring", 00:12:07.112 "raid_level": "concat", 00:12:07.112 "superblock": false, 00:12:07.112 "num_base_bdevs": 3, 00:12:07.112 "num_base_bdevs_discovered": 1, 00:12:07.112 "num_base_bdevs_operational": 3, 00:12:07.112 "base_bdevs_list": [ 00:12:07.112 { 00:12:07.112 "name": "BaseBdev1", 00:12:07.112 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:07.112 "is_configured": true, 00:12:07.112 "data_offset": 0, 00:12:07.112 "data_size": 65536 
00:12:07.112 }, 00:12:07.112 { 00:12:07.112 "name": "BaseBdev2", 00:12:07.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.112 "is_configured": false, 00:12:07.112 "data_offset": 0, 00:12:07.112 "data_size": 0 00:12:07.112 }, 00:12:07.112 { 00:12:07.112 "name": "BaseBdev3", 00:12:07.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.112 "is_configured": false, 00:12:07.112 "data_offset": 0, 00:12:07.112 "data_size": 0 00:12:07.112 } 00:12:07.112 ] 00:12:07.112 }' 00:12:07.112 18:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.112 18:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:07.678 [2024-07-24 18:48:52.638905] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.678 BaseBdev2 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:07.678 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:07.936 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:08.194 [ 00:12:08.194 { 00:12:08.194 "name": "BaseBdev2", 00:12:08.194 "aliases": [ 00:12:08.194 "347b844a-195f-456d-b255-963e624d9cb4" 00:12:08.194 ], 00:12:08.194 "product_name": "Malloc disk", 00:12:08.194 "block_size": 512, 00:12:08.194 "num_blocks": 65536, 00:12:08.194 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:08.194 "assigned_rate_limits": { 00:12:08.194 "rw_ios_per_sec": 0, 00:12:08.194 "rw_mbytes_per_sec": 0, 00:12:08.194 "r_mbytes_per_sec": 0, 00:12:08.194 "w_mbytes_per_sec": 0 00:12:08.194 }, 00:12:08.194 "claimed": true, 00:12:08.194 "claim_type": "exclusive_write", 00:12:08.194 "zoned": false, 00:12:08.194 "supported_io_types": { 00:12:08.194 "read": true, 00:12:08.194 "write": true, 00:12:08.194 "unmap": true, 00:12:08.194 "flush": true, 00:12:08.194 "reset": true, 00:12:08.194 "nvme_admin": false, 00:12:08.194 "nvme_io": false, 00:12:08.194 "nvme_io_md": false, 00:12:08.194 "write_zeroes": true, 00:12:08.194 "zcopy": true, 00:12:08.194 "get_zone_info": false, 00:12:08.194 "zone_management": false, 00:12:08.194 "zone_append": false, 00:12:08.194 "compare": false, 00:12:08.194 "compare_and_write": false, 00:12:08.194 "abort": true, 00:12:08.194 "seek_hole": false, 00:12:08.194 "seek_data": false, 00:12:08.194 "copy": true, 00:12:08.194 "nvme_iov_md": false 00:12:08.194 }, 00:12:08.194 "memory_domains": [ 00:12:08.194 { 00:12:08.194 "dma_device_id": 
"system", 00:12:08.194 "dma_device_type": 1 00:12:08.194 }, 00:12:08.194 { 00:12:08.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.194 "dma_device_type": 2 00:12:08.194 } 00:12:08.194 ], 00:12:08.194 "driver_specific": {} 00:12:08.194 } 00:12:08.194 ] 00:12:08.194 18:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.194 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:08.194 "name": "Existed_Raid", 00:12:08.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.194 "strip_size_kb": 64, 00:12:08.194 "state": "configuring", 00:12:08.194 "raid_level": "concat", 00:12:08.194 "superblock": false, 00:12:08.194 "num_base_bdevs": 3, 00:12:08.194 "num_base_bdevs_discovered": 2, 00:12:08.194 "num_base_bdevs_operational": 3, 00:12:08.194 "base_bdevs_list": [ 00:12:08.194 { 00:12:08.194 "name": "BaseBdev1", 00:12:08.194 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:08.194 "is_configured": true, 00:12:08.194 "data_offset": 0, 00:12:08.194 "data_size": 65536 00:12:08.194 }, 00:12:08.194 { 00:12:08.194 "name": "BaseBdev2", 00:12:08.195 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:08.195 "is_configured": true, 00:12:08.195 "data_offset": 0, 00:12:08.195 "data_size": 65536 00:12:08.195 }, 00:12:08.195 { 00:12:08.195 "name": "BaseBdev3", 00:12:08.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.195 "is_configured": false, 00:12:08.195 "data_offset": 0, 00:12:08.195 "data_size": 0 00:12:08.195 } 00:12:08.195 ] 00:12:08.195 }' 00:12:08.195 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:08.195 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:08.760 18:48:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:09.018 [2024-07-24 18:48:53.816577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:09.018 [2024-07-24 18:48:53.816608] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc69360 00:12:09.018 [2024-07-24 18:48:53.816612] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:09.018 [2024-07-24 18:48:53.816736] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0dcd0 00:12:09.018 [2024-07-24 18:48:53.816822] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc69360 00:12:09.018 [2024-07-24 18:48:53.816827] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc69360 00:12:09.018 [2024-07-24 18:48:53.816944] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.018 BaseBdev3 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:09.018 18:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:09.018 18:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:09.276 [ 00:12:09.276 { 00:12:09.276 "name": "BaseBdev3", 00:12:09.276 "aliases": [ 00:12:09.276 "3b825a4f-b7f8-4250-a758-928309bc4858" 00:12:09.276 ], 00:12:09.276 "product_name": "Malloc disk", 00:12:09.276 "block_size": 512, 00:12:09.276 "num_blocks": 65536, 00:12:09.276 "uuid": "3b825a4f-b7f8-4250-a758-928309bc4858", 00:12:09.276 "assigned_rate_limits": { 00:12:09.276 "rw_ios_per_sec": 0, 00:12:09.276 "rw_mbytes_per_sec": 0, 00:12:09.276 "r_mbytes_per_sec": 0, 00:12:09.276 "w_mbytes_per_sec": 0 00:12:09.276 }, 00:12:09.276 "claimed": true, 00:12:09.276 "claim_type": "exclusive_write", 00:12:09.276 "zoned": false, 00:12:09.276 "supported_io_types": { 00:12:09.276 "read": true, 00:12:09.276 "write": true, 00:12:09.276 "unmap": true, 00:12:09.276 "flush": true, 00:12:09.276 "reset": true, 00:12:09.276 "nvme_admin": false, 00:12:09.276 "nvme_io": false, 00:12:09.276 "nvme_io_md": false, 00:12:09.276 "write_zeroes": true, 00:12:09.276 "zcopy": true, 00:12:09.276 "get_zone_info": false, 00:12:09.276 "zone_management": false, 00:12:09.276 "zone_append": false, 00:12:09.276 "compare": false, 00:12:09.276 "compare_and_write": false, 00:12:09.276 "abort": true, 00:12:09.276 "seek_hole": false, 00:12:09.276 "seek_data": false, 00:12:09.276 "copy": true, 00:12:09.276 "nvme_iov_md": false 00:12:09.276 }, 00:12:09.276 "memory_domains": [ 00:12:09.276 { 00:12:09.276 
"dma_device_id": "system", 00:12:09.276 "dma_device_type": 1 00:12:09.276 }, 00:12:09.276 { 00:12:09.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.276 "dma_device_type": 2 00:12:09.276 } 00:12:09.276 ], 00:12:09.276 "driver_specific": {} 00:12:09.276 } 00:12:09.276 ] 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.276 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.534 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.534 "name": "Existed_Raid", 00:12:09.534 "uuid": "9fe60e14-d788-4cf9-89f2-ef423673e075", 00:12:09.534 "strip_size_kb": 64, 00:12:09.534 "state": "online", 00:12:09.534 "raid_level": "concat", 00:12:09.534 "superblock": false, 00:12:09.534 "num_base_bdevs": 3, 00:12:09.534 "num_base_bdevs_discovered": 3, 00:12:09.534 "num_base_bdevs_operational": 3, 00:12:09.534 "base_bdevs_list": [ 00:12:09.534 { 00:12:09.534 "name": "BaseBdev1", 00:12:09.534 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:09.534 "is_configured": true, 00:12:09.534 "data_offset": 0, 00:12:09.534 "data_size": 65536 00:12:09.534 }, 00:12:09.534 { 00:12:09.534 "name": "BaseBdev2", 00:12:09.534 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:09.534 "is_configured": true, 00:12:09.534 "data_offset": 0, 00:12:09.534 "data_size": 65536 00:12:09.534 }, 00:12:09.534 { 00:12:09.534 "name": "BaseBdev3", 00:12:09.534 "uuid": "3b825a4f-b7f8-4250-a758-928309bc4858", 00:12:09.534 "is_configured": true, 00:12:09.534 "data_offset": 0, 00:12:09.534 "data_size": 65536 00:12:09.534 } 00:12:09.534 ] 00:12:09.534 }' 00:12:09.535 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.535 18:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.099 18:48:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:10.100 18:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:10.100 [2024-07-24 18:48:55.015876] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:10.100 "name": "Existed_Raid", 00:12:10.100 "aliases": [ 00:12:10.100 "9fe60e14-d788-4cf9-89f2-ef423673e075" 00:12:10.100 ], 00:12:10.100 "product_name": "Raid Volume", 00:12:10.100 "block_size": 512, 00:12:10.100 "num_blocks": 196608, 00:12:10.100 "uuid": "9fe60e14-d788-4cf9-89f2-ef423673e075", 00:12:10.100 "assigned_rate_limits": { 00:12:10.100 "rw_ios_per_sec": 0, 00:12:10.100 "rw_mbytes_per_sec": 0, 00:12:10.100 "r_mbytes_per_sec": 0, 00:12:10.100 "w_mbytes_per_sec": 0 00:12:10.100 }, 00:12:10.100 "claimed": false, 00:12:10.100 "zoned": false, 00:12:10.100 "supported_io_types": { 00:12:10.100 "read": true, 00:12:10.100 "write": true, 00:12:10.100 "unmap": true, 00:12:10.100 "flush": true, 00:12:10.100 "reset": true, 00:12:10.100 "nvme_admin": false, 00:12:10.100 "nvme_io": false, 00:12:10.100 "nvme_io_md": false, 00:12:10.100 "write_zeroes": true, 00:12:10.100 "zcopy": false, 00:12:10.100 "get_zone_info": false, 00:12:10.100 "zone_management": false, 00:12:10.100 "zone_append": false, 00:12:10.100 "compare": false, 00:12:10.100 "compare_and_write": false, 00:12:10.100 "abort": false, 00:12:10.100 "seek_hole": false, 00:12:10.100 "seek_data": false, 00:12:10.100 "copy": false, 00:12:10.100 "nvme_iov_md": false 00:12:10.100 }, 00:12:10.100 "memory_domains": [ 00:12:10.100 { 00:12:10.100 "dma_device_id": "system", 00:12:10.100 "dma_device_type": 1 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.100 "dma_device_type": 2 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "dma_device_id": "system", 00:12:10.100 "dma_device_type": 1 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.100 "dma_device_type": 2 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "dma_device_id": "system", 00:12:10.100 "dma_device_type": 1 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.100 "dma_device_type": 2 00:12:10.100 } 00:12:10.100 ], 00:12:10.100 "driver_specific": { 00:12:10.100 "raid": { 00:12:10.100 "uuid": "9fe60e14-d788-4cf9-89f2-ef423673e075", 00:12:10.100 "strip_size_kb": 64, 00:12:10.100 "state": "online", 00:12:10.100 "raid_level": "concat", 00:12:10.100 "superblock": false, 00:12:10.100 "num_base_bdevs": 3, 00:12:10.100 "num_base_bdevs_discovered": 3, 00:12:10.100 "num_base_bdevs_operational": 3, 00:12:10.100 "base_bdevs_list": [ 00:12:10.100 { 
00:12:10.100 "name": "BaseBdev1", 00:12:10.100 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:10.100 "is_configured": true, 00:12:10.100 "data_offset": 0, 00:12:10.100 "data_size": 65536 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "name": "BaseBdev2", 00:12:10.100 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:10.100 "is_configured": true, 00:12:10.100 "data_offset": 0, 00:12:10.100 "data_size": 65536 00:12:10.100 }, 00:12:10.100 { 00:12:10.100 "name": "BaseBdev3", 00:12:10.100 "uuid": "3b825a4f-b7f8-4250-a758-928309bc4858", 00:12:10.100 "is_configured": true, 00:12:10.100 "data_offset": 0, 00:12:10.100 "data_size": 65536 00:12:10.100 } 00:12:10.100 ] 00:12:10.100 } 00:12:10.100 } 00:12:10.100 }' 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:10.100 BaseBdev2 00:12:10.100 BaseBdev3' 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:10.100 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.359 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.359 "name": "BaseBdev1", 00:12:10.359 "aliases": [ 00:12:10.359 "8f270669-a480-4aca-8fbd-73cd1190963c" 00:12:10.359 ], 00:12:10.359 "product_name": "Malloc disk", 00:12:10.359 "block_size": 512, 00:12:10.359 "num_blocks": 65536, 00:12:10.359 "uuid": "8f270669-a480-4aca-8fbd-73cd1190963c", 00:12:10.359 "assigned_rate_limits": { 00:12:10.359 "rw_ios_per_sec": 0, 00:12:10.359 "rw_mbytes_per_sec": 0, 00:12:10.359 "r_mbytes_per_sec": 0, 00:12:10.359 "w_mbytes_per_sec": 0 00:12:10.359 }, 00:12:10.359 "claimed": true, 00:12:10.359 "claim_type": "exclusive_write", 00:12:10.359 "zoned": false, 00:12:10.359 "supported_io_types": { 00:12:10.359 "read": true, 00:12:10.359 "write": true, 00:12:10.359 "unmap": true, 00:12:10.359 "flush": true, 00:12:10.359 "reset": true, 00:12:10.359 "nvme_admin": false, 00:12:10.359 "nvme_io": false, 00:12:10.359 "nvme_io_md": false, 00:12:10.359 "write_zeroes": true, 00:12:10.359 "zcopy": true, 00:12:10.359 "get_zone_info": false, 00:12:10.359 "zone_management": false, 00:12:10.359 "zone_append": false, 00:12:10.359 "compare": false, 00:12:10.359 "compare_and_write": false, 00:12:10.359 "abort": true, 00:12:10.359 "seek_hole": false, 00:12:10.359 "seek_data": false, 00:12:10.359 "copy": true, 00:12:10.359 "nvme_iov_md": false 00:12:10.359 }, 00:12:10.359 "memory_domains": [ 00:12:10.359 { 00:12:10.359 "dma_device_id": "system", 00:12:10.359 "dma_device_type": 1 00:12:10.359 }, 00:12:10.359 { 00:12:10.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.359 "dma_device_type": 2 00:12:10.359 } 00:12:10.359 ], 00:12:10.359 "driver_specific": {} 00:12:10.359 }' 00:12:10.359 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.359 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.359 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:10.359 18:48:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:10.618 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.877 "name": "BaseBdev2", 00:12:10.877 "aliases": [ 00:12:10.877 "347b844a-195f-456d-b255-963e624d9cb4" 00:12:10.877 ], 00:12:10.877 "product_name": "Malloc disk", 00:12:10.877 "block_size": 512, 00:12:10.877 "num_blocks": 65536, 00:12:10.877 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:10.877 "assigned_rate_limits": { 00:12:10.877 "rw_ios_per_sec": 0, 00:12:10.877 "rw_mbytes_per_sec": 0, 00:12:10.877 "r_mbytes_per_sec": 0, 00:12:10.877 "w_mbytes_per_sec": 0 00:12:10.877 }, 00:12:10.877 "claimed": true, 00:12:10.877 "claim_type": "exclusive_write", 00:12:10.877 "zoned": false, 00:12:10.877 "supported_io_types": { 00:12:10.877 "read": true, 00:12:10.877 "write": true, 00:12:10.877 "unmap": true, 00:12:10.877 "flush": true, 00:12:10.877 "reset": true, 00:12:10.877 "nvme_admin": false, 00:12:10.877 "nvme_io": false, 00:12:10.877 "nvme_io_md": false, 00:12:10.877 "write_zeroes": true, 00:12:10.877 "zcopy": true, 00:12:10.877 "get_zone_info": false, 00:12:10.877 "zone_management": false, 00:12:10.877 "zone_append": false, 00:12:10.877 "compare": false, 00:12:10.877 "compare_and_write": false, 00:12:10.877 "abort": true, 00:12:10.877 "seek_hole": false, 00:12:10.877 "seek_data": false, 00:12:10.877 "copy": true, 00:12:10.877 "nvme_iov_md": false 00:12:10.877 }, 00:12:10.877 "memory_domains": [ 00:12:10.877 { 00:12:10.877 "dma_device_id": "system", 00:12:10.877 "dma_device_type": 1 00:12:10.877 }, 00:12:10.877 { 00:12:10.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.877 "dma_device_type": 2 00:12:10.877 } 00:12:10.877 ], 00:12:10.877 "driver_specific": {} 00:12:10.877 }' 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.877 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
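The checks around this point come from verify_raid_bdev_properties: for each configured base bdev it pulls the descriptor with bdev_get_bdevs and compares block_size, md_size, md_interleave and dif_type against the same fields of the raid volume, which is why the trace shows pairs of identical jq probes followed by [[ 512 == 512 ]] and [[ null == null ]] tests. A minimal sketch of one such comparison, assuming the jq filters seen in the trace:

  # Sketch of one property comparison (field list taken from the jq probes above).
  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
  base_info=$($rpc bdev_get_bdevs -b BaseBdev2 | jq '.[]')
  for field in .block_size .md_size .md_interleave .dif_type; do
      raid_val=$(jq -r "$field" <<< "$raid_info")
      base_val=$(jq -r "$field" <<< "$base_info")
      [[ "$raid_val" == "$base_val" ]] || echo "mismatch on $field: $raid_val vs $base_val"
  done

Because concat provides no redundancy, the bdev_malloc_delete BaseBdev1 call a little further down drives the same array from online to offline, which the subsequent verify_raid_bdev_state Existed_Raid offline checks confirm.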
00:12:11.135 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.135 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.135 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.135 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.135 18:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.135 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.135 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.135 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.135 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:11.135 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.392 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.392 "name": "BaseBdev3", 00:12:11.392 "aliases": [ 00:12:11.392 "3b825a4f-b7f8-4250-a758-928309bc4858" 00:12:11.392 ], 00:12:11.392 "product_name": "Malloc disk", 00:12:11.392 "block_size": 512, 00:12:11.393 "num_blocks": 65536, 00:12:11.393 "uuid": "3b825a4f-b7f8-4250-a758-928309bc4858", 00:12:11.393 "assigned_rate_limits": { 00:12:11.393 "rw_ios_per_sec": 0, 00:12:11.393 "rw_mbytes_per_sec": 0, 00:12:11.393 "r_mbytes_per_sec": 0, 00:12:11.393 "w_mbytes_per_sec": 0 00:12:11.393 }, 00:12:11.393 "claimed": true, 00:12:11.393 "claim_type": "exclusive_write", 00:12:11.393 "zoned": false, 00:12:11.393 "supported_io_types": { 00:12:11.393 "read": true, 00:12:11.393 "write": true, 00:12:11.393 "unmap": true, 00:12:11.393 "flush": true, 00:12:11.393 "reset": true, 00:12:11.393 "nvme_admin": false, 00:12:11.393 "nvme_io": false, 00:12:11.393 "nvme_io_md": false, 00:12:11.393 "write_zeroes": true, 00:12:11.393 "zcopy": true, 00:12:11.393 "get_zone_info": false, 00:12:11.393 "zone_management": false, 00:12:11.393 "zone_append": false, 00:12:11.393 "compare": false, 00:12:11.393 "compare_and_write": false, 00:12:11.393 "abort": true, 00:12:11.393 "seek_hole": false, 00:12:11.393 "seek_data": false, 00:12:11.393 "copy": true, 00:12:11.393 "nvme_iov_md": false 00:12:11.393 }, 00:12:11.393 "memory_domains": [ 00:12:11.393 { 00:12:11.393 "dma_device_id": "system", 00:12:11.393 "dma_device_type": 1 00:12:11.393 }, 00:12:11.393 { 00:12:11.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.393 "dma_device_type": 2 00:12:11.393 } 00:12:11.393 ], 00:12:11.393 "driver_specific": {} 00:12:11.393 }' 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.393 18:48:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.650 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:11.908 [2024-07-24 18:48:56.700076] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:11.908 [2024-07-24 18:48:56.700096] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:11.908 [2024-07-24 18:48:56.700124] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.908 "name": "Existed_Raid", 00:12:11.908 "uuid": "9fe60e14-d788-4cf9-89f2-ef423673e075", 00:12:11.908 "strip_size_kb": 64, 00:12:11.908 "state": "offline", 00:12:11.908 "raid_level": "concat", 00:12:11.908 "superblock": false, 00:12:11.908 "num_base_bdevs": 3, 00:12:11.908 "num_base_bdevs_discovered": 2, 
00:12:11.908 "num_base_bdevs_operational": 2, 00:12:11.908 "base_bdevs_list": [ 00:12:11.908 { 00:12:11.908 "name": null, 00:12:11.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:11.908 "is_configured": false, 00:12:11.908 "data_offset": 0, 00:12:11.908 "data_size": 65536 00:12:11.908 }, 00:12:11.908 { 00:12:11.908 "name": "BaseBdev2", 00:12:11.908 "uuid": "347b844a-195f-456d-b255-963e624d9cb4", 00:12:11.908 "is_configured": true, 00:12:11.908 "data_offset": 0, 00:12:11.908 "data_size": 65536 00:12:11.908 }, 00:12:11.908 { 00:12:11.908 "name": "BaseBdev3", 00:12:11.908 "uuid": "3b825a4f-b7f8-4250-a758-928309bc4858", 00:12:11.908 "is_configured": true, 00:12:11.908 "data_offset": 0, 00:12:11.908 "data_size": 65536 00:12:11.908 } 00:12:11.908 ] 00:12:11.908 }' 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.908 18:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.474 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:12.474 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:12.474 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.474 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:12.732 [2024-07-24 18:48:57.691584] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.732 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:12.991 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:12.991 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:12.991 18:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:13.249 [2024-07-24 18:48:58.042183] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:13.249 [2024-07-24 18:48:58.042213] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc69360 name Existed_Raid, state offline 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:13.249 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:13.507 BaseBdev2 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:13.507 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:13.765 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:13.765 [ 00:12:13.765 { 00:12:13.765 "name": "BaseBdev2", 00:12:13.765 "aliases": [ 00:12:13.765 "01f2c5aa-4699-4655-b5ec-1cd48518a78a" 00:12:13.765 ], 00:12:13.765 "product_name": "Malloc disk", 00:12:13.765 "block_size": 512, 00:12:13.765 "num_blocks": 65536, 00:12:13.765 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:13.765 "assigned_rate_limits": { 00:12:13.765 "rw_ios_per_sec": 0, 00:12:13.765 "rw_mbytes_per_sec": 0, 00:12:13.765 "r_mbytes_per_sec": 0, 00:12:13.765 "w_mbytes_per_sec": 0 00:12:13.765 }, 00:12:13.765 "claimed": false, 00:12:13.765 "zoned": false, 00:12:13.765 "supported_io_types": { 00:12:13.765 "read": true, 00:12:13.765 "write": true, 00:12:13.765 "unmap": true, 00:12:13.765 "flush": true, 00:12:13.765 "reset": true, 00:12:13.765 "nvme_admin": false, 00:12:13.765 "nvme_io": false, 00:12:13.765 "nvme_io_md": false, 00:12:13.765 "write_zeroes": true, 00:12:13.765 "zcopy": true, 00:12:13.765 "get_zone_info": false, 00:12:13.765 "zone_management": false, 00:12:13.765 "zone_append": false, 00:12:13.765 "compare": false, 00:12:13.765 "compare_and_write": false, 00:12:13.765 "abort": true, 00:12:13.765 "seek_hole": false, 00:12:13.765 "seek_data": false, 00:12:13.765 "copy": true, 00:12:13.765 "nvme_iov_md": false 00:12:13.765 }, 00:12:13.765 "memory_domains": [ 00:12:13.765 { 00:12:13.765 "dma_device_id": "system", 00:12:13.765 "dma_device_type": 1 00:12:13.765 }, 00:12:13.765 { 00:12:13.765 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:13.765 "dma_device_type": 2 00:12:13.765 } 00:12:13.765 ], 00:12:13.765 "driver_specific": {} 00:12:13.765 } 00:12:13.765 ] 00:12:13.765 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:13.765 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:13.765 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:13.765 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:14.024 BaseBdev3 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:14.024 18:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.282 18:48:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:14.282 [ 00:12:14.282 { 00:12:14.282 "name": "BaseBdev3", 00:12:14.282 "aliases": [ 00:12:14.282 "ed7c59b4-0e35-4cef-be81-05c1b48410bf" 00:12:14.282 ], 00:12:14.282 "product_name": "Malloc disk", 00:12:14.282 "block_size": 512, 00:12:14.282 "num_blocks": 65536, 00:12:14.282 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:14.282 "assigned_rate_limits": { 00:12:14.282 "rw_ios_per_sec": 0, 00:12:14.282 "rw_mbytes_per_sec": 0, 00:12:14.282 "r_mbytes_per_sec": 0, 00:12:14.282 "w_mbytes_per_sec": 0 00:12:14.282 }, 00:12:14.282 "claimed": false, 00:12:14.282 "zoned": false, 00:12:14.282 "supported_io_types": { 00:12:14.282 "read": true, 00:12:14.282 "write": true, 00:12:14.282 "unmap": true, 00:12:14.282 "flush": true, 00:12:14.282 "reset": true, 00:12:14.282 "nvme_admin": false, 00:12:14.282 "nvme_io": false, 00:12:14.282 "nvme_io_md": false, 00:12:14.282 "write_zeroes": true, 00:12:14.282 "zcopy": true, 00:12:14.282 "get_zone_info": false, 00:12:14.282 "zone_management": false, 00:12:14.282 "zone_append": false, 00:12:14.283 "compare": false, 00:12:14.283 "compare_and_write": false, 00:12:14.283 "abort": true, 00:12:14.283 "seek_hole": false, 00:12:14.283 "seek_data": false, 00:12:14.283 "copy": true, 00:12:14.283 "nvme_iov_md": false 00:12:14.283 }, 00:12:14.283 "memory_domains": [ 00:12:14.283 { 00:12:14.283 "dma_device_id": "system", 00:12:14.283 "dma_device_type": 1 00:12:14.283 }, 00:12:14.283 { 00:12:14.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.283 "dma_device_type": 2 00:12:14.283 } 00:12:14.283 ], 00:12:14.283 "driver_specific": {} 00:12:14.283 } 00:12:14.283 ] 00:12:14.283 18:48:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:14.283 
18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:14.283 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:14.283 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:14.540 [2024-07-24 18:48:59.366797] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.540 [2024-07-24 18:48:59.366827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.540 [2024-07-24 18:48:59.366838] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:14.540 [2024-07-24 18:48:59.367773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.540 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.798 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.798 "name": "Existed_Raid", 00:12:14.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.798 "strip_size_kb": 64, 00:12:14.798 "state": "configuring", 00:12:14.798 "raid_level": "concat", 00:12:14.798 "superblock": false, 00:12:14.798 "num_base_bdevs": 3, 00:12:14.798 "num_base_bdevs_discovered": 2, 00:12:14.798 "num_base_bdevs_operational": 3, 00:12:14.798 "base_bdevs_list": [ 00:12:14.798 { 00:12:14.798 "name": "BaseBdev1", 00:12:14.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.798 "is_configured": false, 00:12:14.798 "data_offset": 0, 00:12:14.798 "data_size": 0 00:12:14.798 }, 00:12:14.798 { 00:12:14.798 "name": "BaseBdev2", 00:12:14.798 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:14.798 "is_configured": true, 00:12:14.798 "data_offset": 0, 00:12:14.798 "data_size": 65536 00:12:14.798 }, 00:12:14.798 { 00:12:14.798 "name": "BaseBdev3", 00:12:14.798 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:14.798 "is_configured": 
true, 00:12:14.798 "data_offset": 0, 00:12:14.798 "data_size": 65536 00:12:14.798 } 00:12:14.798 ] 00:12:14.798 }' 00:12:14.798 18:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.798 18:48:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.056 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:15.313 [2024-07-24 18:49:00.180892] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.313 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.570 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.570 "name": "Existed_Raid", 00:12:15.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.571 "strip_size_kb": 64, 00:12:15.571 "state": "configuring", 00:12:15.571 "raid_level": "concat", 00:12:15.571 "superblock": false, 00:12:15.571 "num_base_bdevs": 3, 00:12:15.571 "num_base_bdevs_discovered": 1, 00:12:15.571 "num_base_bdevs_operational": 3, 00:12:15.571 "base_bdevs_list": [ 00:12:15.571 { 00:12:15.571 "name": "BaseBdev1", 00:12:15.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.571 "is_configured": false, 00:12:15.571 "data_offset": 0, 00:12:15.571 "data_size": 0 00:12:15.571 }, 00:12:15.571 { 00:12:15.571 "name": null, 00:12:15.571 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:15.571 "is_configured": false, 00:12:15.571 "data_offset": 0, 00:12:15.571 "data_size": 65536 00:12:15.571 }, 00:12:15.571 { 00:12:15.571 "name": "BaseBdev3", 00:12:15.571 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:15.571 "is_configured": true, 00:12:15.571 "data_offset": 0, 00:12:15.571 "data_size": 65536 00:12:15.571 } 00:12:15.571 ] 00:12:15.571 }' 00:12:15.571 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.571 18:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.136 
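The remove-and-recheck step above uses the same two RPCs that verify_raid_bdev_state wraps; a sketch under the same assumptions as before:

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_remove_base_bdev BaseBdev2
info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
echo "$info" | jq -r '.state'                      # stays "configuring" while a slot is empty
echo "$info" | jq -r '.num_base_bdevs_discovered'  # drops from 2 to 1 after the removal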
18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.136 18:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:16.136 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:16.136 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:16.393 [2024-07-24 18:49:01.182036] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:16.393 BaseBdev1 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.393 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:16.651 [ 00:12:16.651 { 00:12:16.651 "name": "BaseBdev1", 00:12:16.651 "aliases": [ 00:12:16.651 "7c197cbb-eaee-42ae-ad0c-124aea27fac8" 00:12:16.651 ], 00:12:16.651 "product_name": "Malloc disk", 00:12:16.651 "block_size": 512, 00:12:16.651 "num_blocks": 65536, 00:12:16.651 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:16.651 "assigned_rate_limits": { 00:12:16.651 "rw_ios_per_sec": 0, 00:12:16.651 "rw_mbytes_per_sec": 0, 00:12:16.651 "r_mbytes_per_sec": 0, 00:12:16.651 "w_mbytes_per_sec": 0 00:12:16.651 }, 00:12:16.651 "claimed": true, 00:12:16.651 "claim_type": "exclusive_write", 00:12:16.651 "zoned": false, 00:12:16.651 "supported_io_types": { 00:12:16.651 "read": true, 00:12:16.651 "write": true, 00:12:16.651 "unmap": true, 00:12:16.651 "flush": true, 00:12:16.651 "reset": true, 00:12:16.651 "nvme_admin": false, 00:12:16.651 "nvme_io": false, 00:12:16.651 "nvme_io_md": false, 00:12:16.651 "write_zeroes": true, 00:12:16.651 "zcopy": true, 00:12:16.651 "get_zone_info": false, 00:12:16.651 "zone_management": false, 00:12:16.651 "zone_append": false, 00:12:16.651 "compare": false, 00:12:16.651 "compare_and_write": false, 00:12:16.651 "abort": true, 00:12:16.651 "seek_hole": false, 00:12:16.651 "seek_data": false, 00:12:16.651 "copy": true, 00:12:16.651 "nvme_iov_md": false 00:12:16.651 }, 00:12:16.651 "memory_domains": [ 00:12:16.651 { 00:12:16.651 "dma_device_id": "system", 00:12:16.651 "dma_device_type": 1 00:12:16.651 }, 00:12:16.651 { 00:12:16.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.651 "dma_device_type": 2 00:12:16.651 } 00:12:16.651 ], 00:12:16.651 "driver_specific": {} 00:12:16.651 } 00:12:16.651 ] 
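Because the raid is still configuring and waiting on a base bdev named BaseBdev1, creating that malloc bdev is enough for it to be claimed immediately, as the raid_bdev_configure_base_bdev debug line above shows. A quick way to confirm the claim on any base bdev, same assumptions:

./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 \
    | jq -r '.[0].claimed, .[0].claim_type'   # expected: true / exclusive_write once the raid owns it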
00:12:16.651 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:16.651 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:16.651 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.652 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.910 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.910 "name": "Existed_Raid", 00:12:16.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.910 "strip_size_kb": 64, 00:12:16.910 "state": "configuring", 00:12:16.910 "raid_level": "concat", 00:12:16.910 "superblock": false, 00:12:16.910 "num_base_bdevs": 3, 00:12:16.910 "num_base_bdevs_discovered": 2, 00:12:16.910 "num_base_bdevs_operational": 3, 00:12:16.910 "base_bdevs_list": [ 00:12:16.910 { 00:12:16.910 "name": "BaseBdev1", 00:12:16.910 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:16.910 "is_configured": true, 00:12:16.910 "data_offset": 0, 00:12:16.910 "data_size": 65536 00:12:16.910 }, 00:12:16.910 { 00:12:16.910 "name": null, 00:12:16.910 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:16.910 "is_configured": false, 00:12:16.910 "data_offset": 0, 00:12:16.910 "data_size": 65536 00:12:16.910 }, 00:12:16.910 { 00:12:16.910 "name": "BaseBdev3", 00:12:16.910 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:16.910 "is_configured": true, 00:12:16.910 "data_offset": 0, 00:12:16.910 "data_size": 65536 00:12:16.910 } 00:12:16.910 ] 00:12:16.910 }' 00:12:16.910 18:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.910 18:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.475 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.475 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:17.475 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:17.475 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:17.733 [2024-07-24 18:49:02.513519] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.733 "name": "Existed_Raid", 00:12:17.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.733 "strip_size_kb": 64, 00:12:17.733 "state": "configuring", 00:12:17.733 "raid_level": "concat", 00:12:17.733 "superblock": false, 00:12:17.733 "num_base_bdevs": 3, 00:12:17.733 "num_base_bdevs_discovered": 1, 00:12:17.733 "num_base_bdevs_operational": 3, 00:12:17.733 "base_bdevs_list": [ 00:12:17.733 { 00:12:17.733 "name": "BaseBdev1", 00:12:17.733 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:17.733 "is_configured": true, 00:12:17.733 "data_offset": 0, 00:12:17.733 "data_size": 65536 00:12:17.733 }, 00:12:17.733 { 00:12:17.733 "name": null, 00:12:17.733 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:17.733 "is_configured": false, 00:12:17.733 "data_offset": 0, 00:12:17.733 "data_size": 65536 00:12:17.733 }, 00:12:17.733 { 00:12:17.733 "name": null, 00:12:17.733 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:17.733 "is_configured": false, 00:12:17.733 "data_offset": 0, 00:12:17.733 "data_size": 65536 00:12:17.733 } 00:12:17.733 ] 00:12:17.733 }' 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.733 18:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.333 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.333 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:18.333 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false 
== \f\a\l\s\e ]] 00:12:18.333 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:18.590 [2024-07-24 18:49:03.488053] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.590 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.847 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.847 "name": "Existed_Raid", 00:12:18.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.847 "strip_size_kb": 64, 00:12:18.847 "state": "configuring", 00:12:18.847 "raid_level": "concat", 00:12:18.847 "superblock": false, 00:12:18.847 "num_base_bdevs": 3, 00:12:18.847 "num_base_bdevs_discovered": 2, 00:12:18.847 "num_base_bdevs_operational": 3, 00:12:18.847 "base_bdevs_list": [ 00:12:18.847 { 00:12:18.847 "name": "BaseBdev1", 00:12:18.847 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:18.847 "is_configured": true, 00:12:18.847 "data_offset": 0, 00:12:18.847 "data_size": 65536 00:12:18.847 }, 00:12:18.847 { 00:12:18.847 "name": null, 00:12:18.847 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:18.847 "is_configured": false, 00:12:18.847 "data_offset": 0, 00:12:18.847 "data_size": 65536 00:12:18.847 }, 00:12:18.847 { 00:12:18.847 "name": "BaseBdev3", 00:12:18.847 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:18.847 "is_configured": true, 00:12:18.847 "data_offset": 0, 00:12:18.847 "data_size": 65536 00:12:18.847 } 00:12:18.847 ] 00:12:18.847 }' 00:12:18.848 18:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.848 18:49:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.420 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.420 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
jq '.[0].base_bdevs_list[2].is_configured' 00:12:19.420 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:19.420 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:19.678 [2024-07-24 18:49:04.438514] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.678 "name": "Existed_Raid", 00:12:19.678 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.678 "strip_size_kb": 64, 00:12:19.678 "state": "configuring", 00:12:19.678 "raid_level": "concat", 00:12:19.678 "superblock": false, 00:12:19.678 "num_base_bdevs": 3, 00:12:19.678 "num_base_bdevs_discovered": 1, 00:12:19.678 "num_base_bdevs_operational": 3, 00:12:19.678 "base_bdevs_list": [ 00:12:19.678 { 00:12:19.678 "name": null, 00:12:19.678 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:19.678 "is_configured": false, 00:12:19.678 "data_offset": 0, 00:12:19.678 "data_size": 65536 00:12:19.678 }, 00:12:19.678 { 00:12:19.678 "name": null, 00:12:19.678 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:19.678 "is_configured": false, 00:12:19.678 "data_offset": 0, 00:12:19.678 "data_size": 65536 00:12:19.678 }, 00:12:19.678 { 00:12:19.678 "name": "BaseBdev3", 00:12:19.678 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:19.678 "is_configured": true, 00:12:19.678 "data_offset": 0, 00:12:19.678 "data_size": 65536 00:12:19.678 } 00:12:19.678 ] 00:12:19.678 }' 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.678 18:49:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.243 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:20.243 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:20.501 [2024-07-24 18:49:05.450724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.501 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.759 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.759 "name": "Existed_Raid", 00:12:20.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.759 "strip_size_kb": 64, 00:12:20.759 "state": "configuring", 00:12:20.759 "raid_level": "concat", 00:12:20.759 "superblock": false, 00:12:20.759 "num_base_bdevs": 3, 00:12:20.759 "num_base_bdevs_discovered": 2, 00:12:20.759 "num_base_bdevs_operational": 3, 00:12:20.759 "base_bdevs_list": [ 00:12:20.759 { 00:12:20.759 "name": null, 00:12:20.759 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:20.759 "is_configured": false, 00:12:20.759 "data_offset": 0, 00:12:20.759 "data_size": 65536 00:12:20.759 }, 00:12:20.759 { 00:12:20.759 "name": "BaseBdev2", 00:12:20.759 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:20.759 "is_configured": true, 00:12:20.759 "data_offset": 0, 00:12:20.759 "data_size": 65536 00:12:20.759 }, 00:12:20.759 { 00:12:20.759 "name": "BaseBdev3", 00:12:20.759 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:20.759 "is_configured": true, 00:12:20.759 "data_offset": 0, 00:12:20.759 "data_size": 65536 00:12:20.759 } 00:12:20.759 ] 00:12:20.759 }' 00:12:20.759 18:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.759 18:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.322 18:49:06 
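The add path mirrors the remove path: bdev_raid_add_base_bdev re-attaches an existing bdev to the named raid, and the same index probe the test uses on base_bdevs_list flips back to true. A sketch, same assumptions:

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_add_base_bdev Existed_Raid BaseBdev2
$RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expected: true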
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:21.322 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.322 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:21.322 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.322 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:21.580 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7c197cbb-eaee-42ae-ad0c-124aea27fac8 00:12:21.838 [2024-07-24 18:49:06.600399] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:21.838 [2024-07-24 18:49:06.600425] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe11cf0 00:12:21.838 [2024-07-24 18:49:06.600429] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:21.838 [2024-07-24 18:49:06.600569] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0dee0 00:12:21.838 [2024-07-24 18:49:06.600655] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe11cf0 00:12:21.838 [2024-07-24 18:49:06.600659] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe11cf0 00:12:21.838 [2024-07-24 18:49:06.600770] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:21.838 NewBaseBdev 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:21.838 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:22.096 [ 00:12:22.096 { 00:12:22.096 "name": "NewBaseBdev", 00:12:22.096 "aliases": [ 00:12:22.096 "7c197cbb-eaee-42ae-ad0c-124aea27fac8" 00:12:22.096 ], 00:12:22.096 "product_name": "Malloc disk", 00:12:22.096 "block_size": 512, 00:12:22.096 "num_blocks": 65536, 00:12:22.096 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:22.096 "assigned_rate_limits": { 00:12:22.096 "rw_ios_per_sec": 0, 00:12:22.096 "rw_mbytes_per_sec": 0, 00:12:22.096 "r_mbytes_per_sec": 0, 00:12:22.096 "w_mbytes_per_sec": 0 00:12:22.096 }, 00:12:22.096 
"claimed": true, 00:12:22.096 "claim_type": "exclusive_write", 00:12:22.096 "zoned": false, 00:12:22.096 "supported_io_types": { 00:12:22.096 "read": true, 00:12:22.096 "write": true, 00:12:22.096 "unmap": true, 00:12:22.096 "flush": true, 00:12:22.096 "reset": true, 00:12:22.096 "nvme_admin": false, 00:12:22.096 "nvme_io": false, 00:12:22.096 "nvme_io_md": false, 00:12:22.096 "write_zeroes": true, 00:12:22.096 "zcopy": true, 00:12:22.096 "get_zone_info": false, 00:12:22.096 "zone_management": false, 00:12:22.096 "zone_append": false, 00:12:22.096 "compare": false, 00:12:22.096 "compare_and_write": false, 00:12:22.096 "abort": true, 00:12:22.096 "seek_hole": false, 00:12:22.096 "seek_data": false, 00:12:22.096 "copy": true, 00:12:22.096 "nvme_iov_md": false 00:12:22.096 }, 00:12:22.096 "memory_domains": [ 00:12:22.096 { 00:12:22.096 "dma_device_id": "system", 00:12:22.096 "dma_device_type": 1 00:12:22.096 }, 00:12:22.096 { 00:12:22.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.096 "dma_device_type": 2 00:12:22.096 } 00:12:22.096 ], 00:12:22.096 "driver_specific": {} 00:12:22.096 } 00:12:22.096 ] 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.096 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.097 18:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.097 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.097 "name": "Existed_Raid", 00:12:22.097 "uuid": "e31cb93d-dff7-4490-b128-b7c2d5eece21", 00:12:22.097 "strip_size_kb": 64, 00:12:22.097 "state": "online", 00:12:22.097 "raid_level": "concat", 00:12:22.097 "superblock": false, 00:12:22.097 "num_base_bdevs": 3, 00:12:22.097 "num_base_bdevs_discovered": 3, 00:12:22.097 "num_base_bdevs_operational": 3, 00:12:22.097 "base_bdevs_list": [ 00:12:22.097 { 00:12:22.097 "name": "NewBaseBdev", 00:12:22.097 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:22.097 "is_configured": true, 00:12:22.097 "data_offset": 0, 00:12:22.097 "data_size": 65536 00:12:22.097 }, 00:12:22.097 { 00:12:22.097 "name": "BaseBdev2", 00:12:22.097 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 
00:12:22.097 "is_configured": true, 00:12:22.097 "data_offset": 0, 00:12:22.097 "data_size": 65536 00:12:22.097 }, 00:12:22.097 { 00:12:22.097 "name": "BaseBdev3", 00:12:22.097 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:22.097 "is_configured": true, 00:12:22.097 "data_offset": 0, 00:12:22.097 "data_size": 65536 00:12:22.097 } 00:12:22.097 ] 00:12:22.097 }' 00:12:22.097 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.097 18:49:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:22.663 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:22.921 [2024-07-24 18:49:07.755665] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:22.921 "name": "Existed_Raid", 00:12:22.921 "aliases": [ 00:12:22.921 "e31cb93d-dff7-4490-b128-b7c2d5eece21" 00:12:22.921 ], 00:12:22.921 "product_name": "Raid Volume", 00:12:22.921 "block_size": 512, 00:12:22.921 "num_blocks": 196608, 00:12:22.921 "uuid": "e31cb93d-dff7-4490-b128-b7c2d5eece21", 00:12:22.921 "assigned_rate_limits": { 00:12:22.921 "rw_ios_per_sec": 0, 00:12:22.921 "rw_mbytes_per_sec": 0, 00:12:22.921 "r_mbytes_per_sec": 0, 00:12:22.921 "w_mbytes_per_sec": 0 00:12:22.921 }, 00:12:22.921 "claimed": false, 00:12:22.921 "zoned": false, 00:12:22.921 "supported_io_types": { 00:12:22.921 "read": true, 00:12:22.921 "write": true, 00:12:22.921 "unmap": true, 00:12:22.921 "flush": true, 00:12:22.921 "reset": true, 00:12:22.921 "nvme_admin": false, 00:12:22.921 "nvme_io": false, 00:12:22.921 "nvme_io_md": false, 00:12:22.921 "write_zeroes": true, 00:12:22.921 "zcopy": false, 00:12:22.921 "get_zone_info": false, 00:12:22.921 "zone_management": false, 00:12:22.921 "zone_append": false, 00:12:22.921 "compare": false, 00:12:22.921 "compare_and_write": false, 00:12:22.921 "abort": false, 00:12:22.921 "seek_hole": false, 00:12:22.921 "seek_data": false, 00:12:22.921 "copy": false, 00:12:22.921 "nvme_iov_md": false 00:12:22.921 }, 00:12:22.921 "memory_domains": [ 00:12:22.921 { 00:12:22.921 "dma_device_id": "system", 00:12:22.921 "dma_device_type": 1 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.921 "dma_device_type": 2 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "dma_device_id": "system", 00:12:22.921 "dma_device_type": 1 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.921 "dma_device_type": 2 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 
"dma_device_id": "system", 00:12:22.921 "dma_device_type": 1 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.921 "dma_device_type": 2 00:12:22.921 } 00:12:22.921 ], 00:12:22.921 "driver_specific": { 00:12:22.921 "raid": { 00:12:22.921 "uuid": "e31cb93d-dff7-4490-b128-b7c2d5eece21", 00:12:22.921 "strip_size_kb": 64, 00:12:22.921 "state": "online", 00:12:22.921 "raid_level": "concat", 00:12:22.921 "superblock": false, 00:12:22.921 "num_base_bdevs": 3, 00:12:22.921 "num_base_bdevs_discovered": 3, 00:12:22.921 "num_base_bdevs_operational": 3, 00:12:22.921 "base_bdevs_list": [ 00:12:22.921 { 00:12:22.921 "name": "NewBaseBdev", 00:12:22.921 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:22.921 "is_configured": true, 00:12:22.921 "data_offset": 0, 00:12:22.921 "data_size": 65536 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "name": "BaseBdev2", 00:12:22.921 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:22.921 "is_configured": true, 00:12:22.921 "data_offset": 0, 00:12:22.921 "data_size": 65536 00:12:22.921 }, 00:12:22.921 { 00:12:22.921 "name": "BaseBdev3", 00:12:22.921 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:22.921 "is_configured": true, 00:12:22.921 "data_offset": 0, 00:12:22.921 "data_size": 65536 00:12:22.921 } 00:12:22.921 ] 00:12:22.921 } 00:12:22.921 } 00:12:22.921 }' 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:22.921 BaseBdev2 00:12:22.921 BaseBdev3' 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:22.921 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.179 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.179 "name": "NewBaseBdev", 00:12:23.179 "aliases": [ 00:12:23.179 "7c197cbb-eaee-42ae-ad0c-124aea27fac8" 00:12:23.179 ], 00:12:23.179 "product_name": "Malloc disk", 00:12:23.179 "block_size": 512, 00:12:23.179 "num_blocks": 65536, 00:12:23.179 "uuid": "7c197cbb-eaee-42ae-ad0c-124aea27fac8", 00:12:23.179 "assigned_rate_limits": { 00:12:23.179 "rw_ios_per_sec": 0, 00:12:23.179 "rw_mbytes_per_sec": 0, 00:12:23.179 "r_mbytes_per_sec": 0, 00:12:23.179 "w_mbytes_per_sec": 0 00:12:23.179 }, 00:12:23.179 "claimed": true, 00:12:23.179 "claim_type": "exclusive_write", 00:12:23.179 "zoned": false, 00:12:23.179 "supported_io_types": { 00:12:23.179 "read": true, 00:12:23.179 "write": true, 00:12:23.179 "unmap": true, 00:12:23.179 "flush": true, 00:12:23.179 "reset": true, 00:12:23.179 "nvme_admin": false, 00:12:23.179 "nvme_io": false, 00:12:23.179 "nvme_io_md": false, 00:12:23.179 "write_zeroes": true, 00:12:23.179 "zcopy": true, 00:12:23.179 "get_zone_info": false, 00:12:23.179 "zone_management": false, 00:12:23.179 "zone_append": false, 00:12:23.179 "compare": false, 00:12:23.179 "compare_and_write": false, 00:12:23.179 "abort": true, 00:12:23.179 "seek_hole": false, 00:12:23.179 "seek_data": false, 00:12:23.179 "copy": true, 00:12:23.179 "nvme_iov_md": false 00:12:23.179 }, 00:12:23.179 "memory_domains": [ 
00:12:23.179 { 00:12:23.179 "dma_device_id": "system", 00:12:23.179 "dma_device_type": 1 00:12:23.179 }, 00:12:23.179 { 00:12:23.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.179 "dma_device_type": 2 00:12:23.179 } 00:12:23.179 ], 00:12:23.179 "driver_specific": {} 00:12:23.179 }' 00:12:23.179 18:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.179 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.437 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.437 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.437 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.437 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.438 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.438 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:23.438 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.696 "name": "BaseBdev2", 00:12:23.696 "aliases": [ 00:12:23.696 "01f2c5aa-4699-4655-b5ec-1cd48518a78a" 00:12:23.696 ], 00:12:23.696 "product_name": "Malloc disk", 00:12:23.696 "block_size": 512, 00:12:23.696 "num_blocks": 65536, 00:12:23.696 "uuid": "01f2c5aa-4699-4655-b5ec-1cd48518a78a", 00:12:23.696 "assigned_rate_limits": { 00:12:23.696 "rw_ios_per_sec": 0, 00:12:23.696 "rw_mbytes_per_sec": 0, 00:12:23.696 "r_mbytes_per_sec": 0, 00:12:23.696 "w_mbytes_per_sec": 0 00:12:23.696 }, 00:12:23.696 "claimed": true, 00:12:23.696 "claim_type": "exclusive_write", 00:12:23.696 "zoned": false, 00:12:23.696 "supported_io_types": { 00:12:23.696 "read": true, 00:12:23.696 "write": true, 00:12:23.696 "unmap": true, 00:12:23.696 "flush": true, 00:12:23.696 "reset": true, 00:12:23.696 "nvme_admin": false, 00:12:23.696 "nvme_io": false, 00:12:23.696 "nvme_io_md": false, 00:12:23.696 "write_zeroes": true, 00:12:23.696 "zcopy": true, 00:12:23.696 "get_zone_info": false, 00:12:23.696 "zone_management": false, 00:12:23.696 "zone_append": false, 00:12:23.696 "compare": false, 00:12:23.696 "compare_and_write": false, 00:12:23.696 "abort": true, 00:12:23.696 "seek_hole": false, 00:12:23.696 "seek_data": false, 00:12:23.696 "copy": true, 00:12:23.696 "nvme_iov_md": false 00:12:23.696 }, 00:12:23.696 "memory_domains": [ 00:12:23.696 { 00:12:23.696 "dma_device_id": "system", 00:12:23.696 "dma_device_type": 1 00:12:23.696 }, 00:12:23.696 { 00:12:23.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:23.696 "dma_device_type": 2 00:12:23.696 } 00:12:23.696 ], 00:12:23.696 "driver_specific": {} 00:12:23.696 }' 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.696 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:23.954 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:24.212 18:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:24.212 "name": "BaseBdev3", 00:12:24.212 "aliases": [ 00:12:24.212 "ed7c59b4-0e35-4cef-be81-05c1b48410bf" 00:12:24.212 ], 00:12:24.212 "product_name": "Malloc disk", 00:12:24.212 "block_size": 512, 00:12:24.212 "num_blocks": 65536, 00:12:24.212 "uuid": "ed7c59b4-0e35-4cef-be81-05c1b48410bf", 00:12:24.212 "assigned_rate_limits": { 00:12:24.212 "rw_ios_per_sec": 0, 00:12:24.212 "rw_mbytes_per_sec": 0, 00:12:24.212 "r_mbytes_per_sec": 0, 00:12:24.212 "w_mbytes_per_sec": 0 00:12:24.212 }, 00:12:24.212 "claimed": true, 00:12:24.212 "claim_type": "exclusive_write", 00:12:24.212 "zoned": false, 00:12:24.212 "supported_io_types": { 00:12:24.212 "read": true, 00:12:24.212 "write": true, 00:12:24.212 "unmap": true, 00:12:24.212 "flush": true, 00:12:24.212 "reset": true, 00:12:24.212 "nvme_admin": false, 00:12:24.212 "nvme_io": false, 00:12:24.212 "nvme_io_md": false, 00:12:24.212 "write_zeroes": true, 00:12:24.212 "zcopy": true, 00:12:24.212 "get_zone_info": false, 00:12:24.212 "zone_management": false, 00:12:24.212 "zone_append": false, 00:12:24.212 "compare": false, 00:12:24.212 "compare_and_write": false, 00:12:24.212 "abort": true, 00:12:24.212 "seek_hole": false, 00:12:24.212 "seek_data": false, 00:12:24.212 "copy": true, 00:12:24.212 "nvme_iov_md": false 00:12:24.212 }, 00:12:24.212 "memory_domains": [ 00:12:24.212 { 00:12:24.212 "dma_device_id": "system", 00:12:24.212 "dma_device_type": 1 00:12:24.212 }, 00:12:24.212 { 00:12:24.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.212 "dma_device_type": 2 00:12:24.212 } 00:12:24.212 ], 00:12:24.212 "driver_specific": {} 00:12:24.212 }' 00:12:24.212 18:49:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:24.212 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:24.470 [2024-07-24 18:49:09.427813] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:24.470 [2024-07-24 18:49:09.427833] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:24.470 [2024-07-24 18:49:09.427873] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:24.470 [2024-07-24 18:49:09.427910] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:24.470 [2024-07-24 18:49:09.427916] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe11cf0 name Existed_Raid, state offline 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2079786 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2079786 ']' 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2079786 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:24.470 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2079786 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2079786' 00:12:24.729 killing process with pid 2079786 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2079786 00:12:24.729 [2024-07-24 18:49:09.482091] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2079786 00:12:24.729 [2024-07-24 18:49:09.505159] bdev_raid.c:1399:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:24.729 00:12:24.729 real 0m21.412s 00:12:24.729 user 0m39.809s 00:12:24.729 sys 0m3.407s 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:24.729 18:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.729 ************************************ 00:12:24.729 END TEST raid_state_function_test 00:12:24.729 ************************************ 00:12:24.729 18:49:09 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:12:24.729 18:49:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:24.729 18:49:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:24.729 18:49:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.988 ************************************ 00:12:24.988 START TEST raid_state_function_test_sb 00:12:24.988 ************************************ 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:24.988 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2084023 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2084023' 00:12:24.989 Process raid pid: 2084023 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2084023 /var/tmp/spdk-raid.sock 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2084023 ']' 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:24.989 18:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.989 [2024-07-24 18:49:09.810652] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
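Before the superblock variant runs, the harness launches a fresh bdev_svc app that owns the raid RPC socket, and the sb test then creates the array with -s so it carries an on-disk superblock (the raid_bdev_info below reports "superblock": true). A condensed sketch of that startup, with paths relative to the SPDK tree in this workspace:

./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
# once the socket is up (what waitforlisten checks), the raid can be declared
# before any of its base bdevs exist; it sits in the configuring state until they appear
./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid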
00:12:24.989 [2024-07-24 18:49:09.810691] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:24.989 [2024-07-24 18:49:09.875527] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.989 [2024-07-24 18:49:09.946537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.989 [2024-07-24 18:49:09.997283] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.989 [2024-07-24 18:49:09.997306] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:25.923 [2024-07-24 18:49:10.750080] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.923 [2024-07-24 18:49:10.750110] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.923 [2024-07-24 18:49:10.750116] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.923 [2024-07-24 18:49:10.750121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.923 [2024-07-24 18:49:10.750127] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:25.923 [2024-07-24 18:49:10.750132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.923 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.924 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.924 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.181 18:49:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.181 "name": "Existed_Raid", 00:12:26.181 "uuid": "07d32ca7-f96e-4c05-8c14-e2f009789788", 00:12:26.181 "strip_size_kb": 64, 00:12:26.181 "state": "configuring", 00:12:26.181 "raid_level": "concat", 00:12:26.181 "superblock": true, 00:12:26.181 "num_base_bdevs": 3, 00:12:26.181 "num_base_bdevs_discovered": 0, 00:12:26.181 "num_base_bdevs_operational": 3, 00:12:26.181 "base_bdevs_list": [ 00:12:26.181 { 00:12:26.181 "name": "BaseBdev1", 00:12:26.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.181 "is_configured": false, 00:12:26.181 "data_offset": 0, 00:12:26.181 "data_size": 0 00:12:26.181 }, 00:12:26.181 { 00:12:26.181 "name": "BaseBdev2", 00:12:26.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.181 "is_configured": false, 00:12:26.181 "data_offset": 0, 00:12:26.181 "data_size": 0 00:12:26.181 }, 00:12:26.181 { 00:12:26.181 "name": "BaseBdev3", 00:12:26.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.181 "is_configured": false, 00:12:26.181 "data_offset": 0, 00:12:26.181 "data_size": 0 00:12:26.181 } 00:12:26.181 ] 00:12:26.181 }' 00:12:26.181 18:49:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.181 18:49:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:26.439 18:49:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:26.697 [2024-07-24 18:49:11.564100] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:26.697 [2024-07-24 18:49:11.564120] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1740ba0 name Existed_Raid, state configuring 00:12:26.697 18:49:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:26.956 [2024-07-24 18:49:11.740584] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:26.956 [2024-07-24 18:49:11.740606] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:26.956 [2024-07-24 18:49:11.740611] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:26.956 [2024-07-24 18:49:11.740616] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:26.956 [2024-07-24 18:49:11.740620] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:26.956 [2024-07-24 18:49:11.740625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:26.956 [2024-07-24 18:49:11.917195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.956 BaseBdev1 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:26.956 18:49:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:26.956 18:49:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.214 18:49:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:27.473 [ 00:12:27.473 { 00:12:27.473 "name": "BaseBdev1", 00:12:27.473 "aliases": [ 00:12:27.473 "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6" 00:12:27.473 ], 00:12:27.473 "product_name": "Malloc disk", 00:12:27.473 "block_size": 512, 00:12:27.473 "num_blocks": 65536, 00:12:27.473 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:27.473 "assigned_rate_limits": { 00:12:27.473 "rw_ios_per_sec": 0, 00:12:27.473 "rw_mbytes_per_sec": 0, 00:12:27.473 "r_mbytes_per_sec": 0, 00:12:27.473 "w_mbytes_per_sec": 0 00:12:27.473 }, 00:12:27.473 "claimed": true, 00:12:27.473 "claim_type": "exclusive_write", 00:12:27.473 "zoned": false, 00:12:27.473 "supported_io_types": { 00:12:27.473 "read": true, 00:12:27.473 "write": true, 00:12:27.473 "unmap": true, 00:12:27.473 "flush": true, 00:12:27.473 "reset": true, 00:12:27.473 "nvme_admin": false, 00:12:27.473 "nvme_io": false, 00:12:27.473 "nvme_io_md": false, 00:12:27.473 "write_zeroes": true, 00:12:27.473 "zcopy": true, 00:12:27.473 "get_zone_info": false, 00:12:27.473 "zone_management": false, 00:12:27.473 "zone_append": false, 00:12:27.473 "compare": false, 00:12:27.473 "compare_and_write": false, 00:12:27.473 "abort": true, 00:12:27.473 "seek_hole": false, 00:12:27.473 "seek_data": false, 00:12:27.473 "copy": true, 00:12:27.473 "nvme_iov_md": false 00:12:27.473 }, 00:12:27.473 "memory_domains": [ 00:12:27.473 { 00:12:27.473 "dma_device_id": "system", 00:12:27.473 "dma_device_type": 1 00:12:27.473 }, 00:12:27.473 { 00:12:27.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.473 "dma_device_type": 2 00:12:27.473 } 00:12:27.473 ], 00:12:27.473 "driver_specific": {} 00:12:27.473 } 00:12:27.473 ] 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.473 "name": "Existed_Raid", 00:12:27.473 "uuid": "9e7199e2-35ff-46b9-838e-174f00c740f6", 00:12:27.473 "strip_size_kb": 64, 00:12:27.473 "state": "configuring", 00:12:27.473 "raid_level": "concat", 00:12:27.473 "superblock": true, 00:12:27.473 "num_base_bdevs": 3, 00:12:27.473 "num_base_bdevs_discovered": 1, 00:12:27.473 "num_base_bdevs_operational": 3, 00:12:27.473 "base_bdevs_list": [ 00:12:27.473 { 00:12:27.473 "name": "BaseBdev1", 00:12:27.473 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:27.473 "is_configured": true, 00:12:27.473 "data_offset": 2048, 00:12:27.473 "data_size": 63488 00:12:27.473 }, 00:12:27.473 { 00:12:27.473 "name": "BaseBdev2", 00:12:27.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.473 "is_configured": false, 00:12:27.473 "data_offset": 0, 00:12:27.473 "data_size": 0 00:12:27.473 }, 00:12:27.473 { 00:12:27.473 "name": "BaseBdev3", 00:12:27.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.473 "is_configured": false, 00:12:27.473 "data_offset": 0, 00:12:27.473 "data_size": 0 00:12:27.473 } 00:12:27.473 ] 00:12:27.473 }' 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.473 18:49:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.039 18:49:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:28.297 [2024-07-24 18:49:13.064156] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:28.297 [2024-07-24 18:49:13.064193] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1740470 name Existed_Raid, state configuring 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:28.297 [2024-07-24 18:49:13.224607] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:28.297 [2024-07-24 18:49:13.225712] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.297 [2024-07-24 18:49:13.225739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.297 [2024-07-24 18:49:13.225745] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:28.297 [2024-07-24 18:49:13.225751] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:28.297 18:49:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.297 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.554 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.554 "name": "Existed_Raid", 00:12:28.554 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:28.554 "strip_size_kb": 64, 00:12:28.554 "state": "configuring", 00:12:28.554 "raid_level": "concat", 00:12:28.554 "superblock": true, 00:12:28.554 "num_base_bdevs": 3, 00:12:28.554 "num_base_bdevs_discovered": 1, 00:12:28.554 "num_base_bdevs_operational": 3, 00:12:28.554 "base_bdevs_list": [ 00:12:28.554 { 00:12:28.554 "name": "BaseBdev1", 00:12:28.554 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:28.554 "is_configured": true, 00:12:28.554 "data_offset": 2048, 00:12:28.554 "data_size": 63488 00:12:28.554 }, 00:12:28.554 { 00:12:28.554 "name": "BaseBdev2", 00:12:28.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.554 "is_configured": false, 00:12:28.554 "data_offset": 0, 00:12:28.554 "data_size": 0 00:12:28.554 }, 00:12:28.554 { 00:12:28.554 "name": "BaseBdev3", 00:12:28.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.554 "is_configured": false, 00:12:28.554 "data_offset": 0, 00:12:28.554 "data_size": 0 00:12:28.554 } 00:12:28.554 ] 00:12:28.554 }' 00:12:28.554 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.554 18:49:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:29.119 18:49:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:29.119 [2024-07-24 18:49:14.041286] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.119 BaseBdev2 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:29.119 18:49:14 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:29.119 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.377 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:29.377 [ 00:12:29.377 { 00:12:29.377 "name": "BaseBdev2", 00:12:29.377 "aliases": [ 00:12:29.377 "aec82cc6-346b-4833-9343-54a7c6700a13" 00:12:29.377 ], 00:12:29.377 "product_name": "Malloc disk", 00:12:29.377 "block_size": 512, 00:12:29.377 "num_blocks": 65536, 00:12:29.377 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:29.377 "assigned_rate_limits": { 00:12:29.377 "rw_ios_per_sec": 0, 00:12:29.377 "rw_mbytes_per_sec": 0, 00:12:29.377 "r_mbytes_per_sec": 0, 00:12:29.377 "w_mbytes_per_sec": 0 00:12:29.377 }, 00:12:29.377 "claimed": true, 00:12:29.377 "claim_type": "exclusive_write", 00:12:29.377 "zoned": false, 00:12:29.377 "supported_io_types": { 00:12:29.377 "read": true, 00:12:29.377 "write": true, 00:12:29.377 "unmap": true, 00:12:29.377 "flush": true, 00:12:29.377 "reset": true, 00:12:29.377 "nvme_admin": false, 00:12:29.377 "nvme_io": false, 00:12:29.377 "nvme_io_md": false, 00:12:29.377 "write_zeroes": true, 00:12:29.377 "zcopy": true, 00:12:29.377 "get_zone_info": false, 00:12:29.377 "zone_management": false, 00:12:29.377 "zone_append": false, 00:12:29.377 "compare": false, 00:12:29.377 "compare_and_write": false, 00:12:29.377 "abort": true, 00:12:29.377 "seek_hole": false, 00:12:29.377 "seek_data": false, 00:12:29.377 "copy": true, 00:12:29.377 "nvme_iov_md": false 00:12:29.377 }, 00:12:29.377 "memory_domains": [ 00:12:29.377 { 00:12:29.377 "dma_device_id": "system", 00:12:29.377 "dma_device_type": 1 00:12:29.377 }, 00:12:29.377 { 00:12:29.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.377 "dma_device_type": 2 00:12:29.377 } 00:12:29.377 ], 00:12:29.377 "driver_specific": {} 00:12:29.377 } 00:12:29.377 ] 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.635 "name": "Existed_Raid", 00:12:29.635 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:29.635 "strip_size_kb": 64, 00:12:29.635 "state": "configuring", 00:12:29.635 "raid_level": "concat", 00:12:29.635 "superblock": true, 00:12:29.635 "num_base_bdevs": 3, 00:12:29.635 "num_base_bdevs_discovered": 2, 00:12:29.635 "num_base_bdevs_operational": 3, 00:12:29.635 "base_bdevs_list": [ 00:12:29.635 { 00:12:29.635 "name": "BaseBdev1", 00:12:29.635 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:29.635 "is_configured": true, 00:12:29.635 "data_offset": 2048, 00:12:29.635 "data_size": 63488 00:12:29.635 }, 00:12:29.635 { 00:12:29.635 "name": "BaseBdev2", 00:12:29.635 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:29.635 "is_configured": true, 00:12:29.635 "data_offset": 2048, 00:12:29.635 "data_size": 63488 00:12:29.635 }, 00:12:29.635 { 00:12:29.635 "name": "BaseBdev3", 00:12:29.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.635 "is_configured": false, 00:12:29.635 "data_offset": 0, 00:12:29.635 "data_size": 0 00:12:29.635 } 00:12:29.635 ] 00:12:29.635 }' 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.635 18:49:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.201 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:30.459 [2024-07-24 18:49:15.230979] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:30.459 [2024-07-24 18:49:15.231102] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1741360 00:12:30.459 [2024-07-24 18:49:15.231111] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:30.459 [2024-07-24 18:49:15.231227] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14467d0 00:12:30.459 [2024-07-24 18:49:15.231309] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1741360 00:12:30.459 [2024-07-24 18:49:15.231314] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1741360 00:12:30.459 [2024-07-24 18:49:15.231379] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:30.459 BaseBdev3 00:12:30.459 18:49:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.459 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:30.717 [ 00:12:30.717 { 00:12:30.717 "name": "BaseBdev3", 00:12:30.717 "aliases": [ 00:12:30.717 "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6" 00:12:30.717 ], 00:12:30.717 "product_name": "Malloc disk", 00:12:30.717 "block_size": 512, 00:12:30.717 "num_blocks": 65536, 00:12:30.717 "uuid": "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6", 00:12:30.717 "assigned_rate_limits": { 00:12:30.717 "rw_ios_per_sec": 0, 00:12:30.717 "rw_mbytes_per_sec": 0, 00:12:30.717 "r_mbytes_per_sec": 0, 00:12:30.717 "w_mbytes_per_sec": 0 00:12:30.717 }, 00:12:30.717 "claimed": true, 00:12:30.717 "claim_type": "exclusive_write", 00:12:30.717 "zoned": false, 00:12:30.717 "supported_io_types": { 00:12:30.717 "read": true, 00:12:30.717 "write": true, 00:12:30.717 "unmap": true, 00:12:30.717 "flush": true, 00:12:30.717 "reset": true, 00:12:30.717 "nvme_admin": false, 00:12:30.717 "nvme_io": false, 00:12:30.717 "nvme_io_md": false, 00:12:30.717 "write_zeroes": true, 00:12:30.717 "zcopy": true, 00:12:30.717 "get_zone_info": false, 00:12:30.717 "zone_management": false, 00:12:30.717 "zone_append": false, 00:12:30.717 "compare": false, 00:12:30.717 "compare_and_write": false, 00:12:30.717 "abort": true, 00:12:30.717 "seek_hole": false, 00:12:30.717 "seek_data": false, 00:12:30.717 "copy": true, 00:12:30.718 "nvme_iov_md": false 00:12:30.718 }, 00:12:30.718 "memory_domains": [ 00:12:30.718 { 00:12:30.718 "dma_device_id": "system", 00:12:30.718 "dma_device_type": 1 00:12:30.718 }, 00:12:30.718 { 00:12:30.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.718 "dma_device_type": 2 00:12:30.718 } 00:12:30.718 ], 00:12:30.718 "driver_specific": {} 00:12:30.718 } 00:12:30.718 ] 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 
-- # local raid_level=concat 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.718 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.976 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.976 "name": "Existed_Raid", 00:12:30.976 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:30.976 "strip_size_kb": 64, 00:12:30.976 "state": "online", 00:12:30.976 "raid_level": "concat", 00:12:30.976 "superblock": true, 00:12:30.976 "num_base_bdevs": 3, 00:12:30.976 "num_base_bdevs_discovered": 3, 00:12:30.976 "num_base_bdevs_operational": 3, 00:12:30.976 "base_bdevs_list": [ 00:12:30.976 { 00:12:30.976 "name": "BaseBdev1", 00:12:30.976 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:30.976 "is_configured": true, 00:12:30.976 "data_offset": 2048, 00:12:30.976 "data_size": 63488 00:12:30.976 }, 00:12:30.976 { 00:12:30.976 "name": "BaseBdev2", 00:12:30.976 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:30.976 "is_configured": true, 00:12:30.976 "data_offset": 2048, 00:12:30.976 "data_size": 63488 00:12:30.976 }, 00:12:30.976 { 00:12:30.976 "name": "BaseBdev3", 00:12:30.976 "uuid": "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6", 00:12:30.976 "is_configured": true, 00:12:30.976 "data_offset": 2048, 00:12:30.976 "data_size": 63488 00:12:30.976 } 00:12:30.976 ] 00:12:30.976 }' 00:12:30.976 18:49:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.976 18:49:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:31.235 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:31.494 [2024-07-24 18:49:16.358317] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:31.494 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:31.494 "name": "Existed_Raid", 00:12:31.494 "aliases": [ 00:12:31.494 "cd060fe0-e0da-4881-950d-b3c0c1d90ae0" 00:12:31.494 ], 00:12:31.494 "product_name": "Raid Volume", 00:12:31.494 "block_size": 512, 00:12:31.494 "num_blocks": 190464, 00:12:31.494 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:31.494 "assigned_rate_limits": { 00:12:31.494 "rw_ios_per_sec": 0, 00:12:31.494 "rw_mbytes_per_sec": 0, 00:12:31.494 "r_mbytes_per_sec": 0, 00:12:31.494 "w_mbytes_per_sec": 0 00:12:31.494 }, 00:12:31.494 "claimed": false, 00:12:31.494 "zoned": false, 00:12:31.494 "supported_io_types": { 00:12:31.494 "read": true, 00:12:31.494 "write": true, 00:12:31.494 "unmap": true, 00:12:31.494 "flush": true, 00:12:31.494 "reset": true, 00:12:31.494 "nvme_admin": false, 00:12:31.494 "nvme_io": false, 00:12:31.494 "nvme_io_md": false, 00:12:31.494 "write_zeroes": true, 00:12:31.494 "zcopy": false, 00:12:31.494 "get_zone_info": false, 00:12:31.494 "zone_management": false, 00:12:31.494 "zone_append": false, 00:12:31.494 "compare": false, 00:12:31.494 "compare_and_write": false, 00:12:31.494 "abort": false, 00:12:31.494 "seek_hole": false, 00:12:31.494 "seek_data": false, 00:12:31.494 "copy": false, 00:12:31.494 "nvme_iov_md": false 00:12:31.494 }, 00:12:31.494 "memory_domains": [ 00:12:31.494 { 00:12:31.494 "dma_device_id": "system", 00:12:31.494 "dma_device_type": 1 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.494 "dma_device_type": 2 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "dma_device_id": "system", 00:12:31.494 "dma_device_type": 1 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.494 "dma_device_type": 2 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "dma_device_id": "system", 00:12:31.494 "dma_device_type": 1 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.494 "dma_device_type": 2 00:12:31.494 } 00:12:31.494 ], 00:12:31.494 "driver_specific": { 00:12:31.494 "raid": { 00:12:31.494 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:31.494 "strip_size_kb": 64, 00:12:31.494 "state": "online", 00:12:31.494 "raid_level": "concat", 00:12:31.494 "superblock": true, 00:12:31.494 "num_base_bdevs": 3, 00:12:31.494 "num_base_bdevs_discovered": 3, 00:12:31.494 "num_base_bdevs_operational": 3, 00:12:31.494 "base_bdevs_list": [ 00:12:31.494 { 00:12:31.494 "name": "BaseBdev1", 00:12:31.494 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:31.494 "is_configured": true, 00:12:31.494 "data_offset": 2048, 00:12:31.494 "data_size": 63488 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "name": "BaseBdev2", 00:12:31.494 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:31.494 "is_configured": true, 00:12:31.494 "data_offset": 2048, 00:12:31.494 "data_size": 63488 00:12:31.494 }, 00:12:31.494 { 00:12:31.494 "name": "BaseBdev3", 00:12:31.494 "uuid": "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6", 00:12:31.494 "is_configured": true, 00:12:31.494 "data_offset": 2048, 00:12:31.494 "data_size": 63488 00:12:31.494 } 00:12:31.494 ] 00:12:31.494 } 00:12:31.494 } 00:12:31.494 }' 00:12:31.494 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:31.494 18:49:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:31.494 BaseBdev2 00:12:31.494 BaseBdev3' 00:12:31.494 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.494 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.494 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.753 "name": "BaseBdev1", 00:12:31.753 "aliases": [ 00:12:31.753 "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6" 00:12:31.753 ], 00:12:31.753 "product_name": "Malloc disk", 00:12:31.753 "block_size": 512, 00:12:31.753 "num_blocks": 65536, 00:12:31.753 "uuid": "44cbe4dc-d1e9-4e60-89c6-1d899a0852f6", 00:12:31.753 "assigned_rate_limits": { 00:12:31.753 "rw_ios_per_sec": 0, 00:12:31.753 "rw_mbytes_per_sec": 0, 00:12:31.753 "r_mbytes_per_sec": 0, 00:12:31.753 "w_mbytes_per_sec": 0 00:12:31.753 }, 00:12:31.753 "claimed": true, 00:12:31.753 "claim_type": "exclusive_write", 00:12:31.753 "zoned": false, 00:12:31.753 "supported_io_types": { 00:12:31.753 "read": true, 00:12:31.753 "write": true, 00:12:31.753 "unmap": true, 00:12:31.753 "flush": true, 00:12:31.753 "reset": true, 00:12:31.753 "nvme_admin": false, 00:12:31.753 "nvme_io": false, 00:12:31.753 "nvme_io_md": false, 00:12:31.753 "write_zeroes": true, 00:12:31.753 "zcopy": true, 00:12:31.753 "get_zone_info": false, 00:12:31.753 "zone_management": false, 00:12:31.753 "zone_append": false, 00:12:31.753 "compare": false, 00:12:31.753 "compare_and_write": false, 00:12:31.753 "abort": true, 00:12:31.753 "seek_hole": false, 00:12:31.753 "seek_data": false, 00:12:31.753 "copy": true, 00:12:31.753 "nvme_iov_md": false 00:12:31.753 }, 00:12:31.753 "memory_domains": [ 00:12:31.753 { 00:12:31.753 "dma_device_id": "system", 00:12:31.753 "dma_device_type": 1 00:12:31.753 }, 00:12:31.753 { 00:12:31.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.753 "dma_device_type": 2 00:12:31.753 } 00:12:31.753 ], 00:12:31.753 "driver_specific": {} 00:12:31.753 }' 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.753 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == 
null ]] 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:32.012 18:49:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.012 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.012 "name": "BaseBdev2", 00:12:32.012 "aliases": [ 00:12:32.012 "aec82cc6-346b-4833-9343-54a7c6700a13" 00:12:32.012 ], 00:12:32.012 "product_name": "Malloc disk", 00:12:32.012 "block_size": 512, 00:12:32.012 "num_blocks": 65536, 00:12:32.012 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:32.012 "assigned_rate_limits": { 00:12:32.012 "rw_ios_per_sec": 0, 00:12:32.012 "rw_mbytes_per_sec": 0, 00:12:32.012 "r_mbytes_per_sec": 0, 00:12:32.012 "w_mbytes_per_sec": 0 00:12:32.012 }, 00:12:32.012 "claimed": true, 00:12:32.012 "claim_type": "exclusive_write", 00:12:32.012 "zoned": false, 00:12:32.012 "supported_io_types": { 00:12:32.012 "read": true, 00:12:32.012 "write": true, 00:12:32.012 "unmap": true, 00:12:32.012 "flush": true, 00:12:32.012 "reset": true, 00:12:32.012 "nvme_admin": false, 00:12:32.012 "nvme_io": false, 00:12:32.012 "nvme_io_md": false, 00:12:32.012 "write_zeroes": true, 00:12:32.012 "zcopy": true, 00:12:32.012 "get_zone_info": false, 00:12:32.012 "zone_management": false, 00:12:32.012 "zone_append": false, 00:12:32.012 "compare": false, 00:12:32.012 "compare_and_write": false, 00:12:32.012 "abort": true, 00:12:32.012 "seek_hole": false, 00:12:32.012 "seek_data": false, 00:12:32.012 "copy": true, 00:12:32.012 "nvme_iov_md": false 00:12:32.012 }, 00:12:32.012 "memory_domains": [ 00:12:32.012 { 00:12:32.012 "dma_device_id": "system", 00:12:32.012 "dma_device_type": 1 00:12:32.012 }, 00:12:32.012 { 00:12:32.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.012 "dma_device_type": 2 00:12:32.012 } 00:12:32.012 ], 00:12:32.012 "driver_specific": {} 00:12:32.012 }' 00:12:32.012 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.270 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.529 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.529 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 
00:12:32.529 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:32.529 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.529 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.529 "name": "BaseBdev3", 00:12:32.529 "aliases": [ 00:12:32.529 "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6" 00:12:32.529 ], 00:12:32.529 "product_name": "Malloc disk", 00:12:32.529 "block_size": 512, 00:12:32.529 "num_blocks": 65536, 00:12:32.529 "uuid": "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6", 00:12:32.529 "assigned_rate_limits": { 00:12:32.529 "rw_ios_per_sec": 0, 00:12:32.529 "rw_mbytes_per_sec": 0, 00:12:32.529 "r_mbytes_per_sec": 0, 00:12:32.529 "w_mbytes_per_sec": 0 00:12:32.530 }, 00:12:32.530 "claimed": true, 00:12:32.530 "claim_type": "exclusive_write", 00:12:32.530 "zoned": false, 00:12:32.530 "supported_io_types": { 00:12:32.530 "read": true, 00:12:32.530 "write": true, 00:12:32.530 "unmap": true, 00:12:32.530 "flush": true, 00:12:32.530 "reset": true, 00:12:32.530 "nvme_admin": false, 00:12:32.530 "nvme_io": false, 00:12:32.530 "nvme_io_md": false, 00:12:32.530 "write_zeroes": true, 00:12:32.530 "zcopy": true, 00:12:32.530 "get_zone_info": false, 00:12:32.530 "zone_management": false, 00:12:32.530 "zone_append": false, 00:12:32.530 "compare": false, 00:12:32.530 "compare_and_write": false, 00:12:32.530 "abort": true, 00:12:32.530 "seek_hole": false, 00:12:32.530 "seek_data": false, 00:12:32.530 "copy": true, 00:12:32.530 "nvme_iov_md": false 00:12:32.530 }, 00:12:32.530 "memory_domains": [ 00:12:32.530 { 00:12:32.530 "dma_device_id": "system", 00:12:32.530 "dma_device_type": 1 00:12:32.530 }, 00:12:32.530 { 00:12:32.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.530 "dma_device_type": 2 00:12:32.530 } 00:12:32.530 ], 00:12:32.530 "driver_specific": {} 00:12:32.530 }' 00:12:32.530 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.530 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.530 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.530 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.788 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:33.047 [2024-07-24 18:49:17.890100] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:33.047 [2024-07-24 18:49:17.890120] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:33.047 [2024-07-24 18:49:17.890149] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.047 18:49:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.305 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.305 "name": "Existed_Raid", 00:12:33.305 "uuid": "cd060fe0-e0da-4881-950d-b3c0c1d90ae0", 00:12:33.305 "strip_size_kb": 64, 00:12:33.305 "state": "offline", 00:12:33.305 "raid_level": "concat", 00:12:33.305 "superblock": true, 00:12:33.305 "num_base_bdevs": 3, 00:12:33.305 "num_base_bdevs_discovered": 2, 00:12:33.305 "num_base_bdevs_operational": 2, 00:12:33.305 "base_bdevs_list": [ 00:12:33.305 { 00:12:33.305 "name": null, 00:12:33.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.305 "is_configured": false, 00:12:33.305 "data_offset": 2048, 00:12:33.305 "data_size": 63488 00:12:33.305 }, 00:12:33.305 { 00:12:33.305 "name": "BaseBdev2", 00:12:33.305 "uuid": "aec82cc6-346b-4833-9343-54a7c6700a13", 00:12:33.305 "is_configured": true, 00:12:33.305 "data_offset": 2048, 00:12:33.305 "data_size": 63488 00:12:33.305 }, 00:12:33.305 { 00:12:33.305 "name": "BaseBdev3", 00:12:33.305 "uuid": "ee0d0cdd-6c99-4493-a75f-0ac7db1dd4f6", 00:12:33.305 "is_configured": true, 00:12:33.305 "data_offset": 2048, 00:12:33.305 "data_size": 63488 00:12:33.305 } 
00:12:33.305 ] 00:12:33.305 }' 00:12:33.305 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.305 18:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.563 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:33.563 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:33.563 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:33.563 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.822 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:33.822 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:33.822 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:34.080 [2024-07-24 18:49:18.865386] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:34.080 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:34.080 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:34.080 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.080 18:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:34.081 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:34.081 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:34.081 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:34.339 [2024-07-24 18:49:19.200103] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:34.339 [2024-07-24 18:49:19.200137] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1741360 name Existed_Raid, state offline 00:12:34.339 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:34.339 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:34.339 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.339 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:34.597 18:49:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:34.597 BaseBdev2 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:34.597 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:34.855 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:34.855 [ 00:12:34.855 { 00:12:34.855 "name": "BaseBdev2", 00:12:34.855 "aliases": [ 00:12:34.855 "a18ab113-a3ee-4e1d-a67a-17d3588f6c54" 00:12:34.855 ], 00:12:34.855 "product_name": "Malloc disk", 00:12:34.855 "block_size": 512, 00:12:34.855 "num_blocks": 65536, 00:12:34.855 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:34.855 "assigned_rate_limits": { 00:12:34.855 "rw_ios_per_sec": 0, 00:12:34.855 "rw_mbytes_per_sec": 0, 00:12:34.855 "r_mbytes_per_sec": 0, 00:12:34.855 "w_mbytes_per_sec": 0 00:12:34.855 }, 00:12:34.855 "claimed": false, 00:12:34.855 "zoned": false, 00:12:34.855 "supported_io_types": { 00:12:34.855 "read": true, 00:12:34.855 "write": true, 00:12:34.855 "unmap": true, 00:12:34.855 "flush": true, 00:12:34.855 "reset": true, 00:12:34.855 "nvme_admin": false, 00:12:34.855 "nvme_io": false, 00:12:34.855 "nvme_io_md": false, 00:12:34.855 "write_zeroes": true, 00:12:34.855 "zcopy": true, 00:12:34.855 "get_zone_info": false, 00:12:34.855 "zone_management": false, 00:12:34.855 "zone_append": false, 00:12:34.855 "compare": false, 00:12:34.855 "compare_and_write": false, 00:12:34.855 "abort": true, 00:12:34.855 "seek_hole": false, 00:12:34.855 "seek_data": false, 00:12:34.855 "copy": true, 00:12:34.855 "nvme_iov_md": false 00:12:34.855 }, 00:12:34.855 "memory_domains": [ 00:12:34.855 { 00:12:34.855 "dma_device_id": "system", 00:12:34.855 "dma_device_type": 1 00:12:34.855 }, 00:12:34.855 { 00:12:34.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.855 "dma_device_type": 2 00:12:34.855 } 00:12:34.855 ], 00:12:34.855 "driver_specific": {} 00:12:34.855 } 00:12:34.855 ] 00:12:35.113 18:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:35.113 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:35.113 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:35.113 18:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:35.113 BaseBdev3 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:35.113 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:35.372 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:35.372 [ 00:12:35.372 { 00:12:35.372 "name": "BaseBdev3", 00:12:35.372 "aliases": [ 00:12:35.372 "8f66d0f1-4f67-4db5-b68b-8226593fc891" 00:12:35.372 ], 00:12:35.372 "product_name": "Malloc disk", 00:12:35.372 "block_size": 512, 00:12:35.372 "num_blocks": 65536, 00:12:35.372 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:35.372 "assigned_rate_limits": { 00:12:35.372 "rw_ios_per_sec": 0, 00:12:35.372 "rw_mbytes_per_sec": 0, 00:12:35.372 "r_mbytes_per_sec": 0, 00:12:35.372 "w_mbytes_per_sec": 0 00:12:35.372 }, 00:12:35.372 "claimed": false, 00:12:35.372 "zoned": false, 00:12:35.372 "supported_io_types": { 00:12:35.372 "read": true, 00:12:35.372 "write": true, 00:12:35.372 "unmap": true, 00:12:35.372 "flush": true, 00:12:35.372 "reset": true, 00:12:35.372 "nvme_admin": false, 00:12:35.372 "nvme_io": false, 00:12:35.372 "nvme_io_md": false, 00:12:35.372 "write_zeroes": true, 00:12:35.372 "zcopy": true, 00:12:35.372 "get_zone_info": false, 00:12:35.372 "zone_management": false, 00:12:35.372 "zone_append": false, 00:12:35.372 "compare": false, 00:12:35.372 "compare_and_write": false, 00:12:35.372 "abort": true, 00:12:35.372 "seek_hole": false, 00:12:35.372 "seek_data": false, 00:12:35.372 "copy": true, 00:12:35.372 "nvme_iov_md": false 00:12:35.372 }, 00:12:35.372 "memory_domains": [ 00:12:35.372 { 00:12:35.372 "dma_device_id": "system", 00:12:35.372 "dma_device_type": 1 00:12:35.372 }, 00:12:35.372 { 00:12:35.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.372 "dma_device_type": 2 00:12:35.372 } 00:12:35.372 ], 00:12:35.372 "driver_specific": {} 00:12:35.372 } 00:12:35.372 ] 00:12:35.372 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:35.372 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:35.372 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:35.372 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:35.631 [2024-07-24 18:49:20.525074] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:35.631 
[2024-07-24 18:49:20.525106] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:35.631 [2024-07-24 18:49:20.525118] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:35.631 [2024-07-24 18:49:20.526104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.631 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.890 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.890 "name": "Existed_Raid", 00:12:35.890 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:35.890 "strip_size_kb": 64, 00:12:35.890 "state": "configuring", 00:12:35.890 "raid_level": "concat", 00:12:35.890 "superblock": true, 00:12:35.890 "num_base_bdevs": 3, 00:12:35.890 "num_base_bdevs_discovered": 2, 00:12:35.890 "num_base_bdevs_operational": 3, 00:12:35.890 "base_bdevs_list": [ 00:12:35.890 { 00:12:35.890 "name": "BaseBdev1", 00:12:35.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.890 "is_configured": false, 00:12:35.890 "data_offset": 0, 00:12:35.890 "data_size": 0 00:12:35.890 }, 00:12:35.890 { 00:12:35.890 "name": "BaseBdev2", 00:12:35.890 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:35.890 "is_configured": true, 00:12:35.890 "data_offset": 2048, 00:12:35.890 "data_size": 63488 00:12:35.890 }, 00:12:35.890 { 00:12:35.890 "name": "BaseBdev3", 00:12:35.890 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:35.890 "is_configured": true, 00:12:35.890 "data_offset": 2048, 00:12:35.890 "data_size": 63488 00:12:35.890 } 00:12:35.890 ] 00:12:35.890 }' 00:12:35.890 18:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.890 18:49:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:36.457 
[2024-07-24 18:49:21.371248] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.457 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.716 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.716 "name": "Existed_Raid", 00:12:36.716 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:36.716 "strip_size_kb": 64, 00:12:36.716 "state": "configuring", 00:12:36.716 "raid_level": "concat", 00:12:36.716 "superblock": true, 00:12:36.716 "num_base_bdevs": 3, 00:12:36.716 "num_base_bdevs_discovered": 1, 00:12:36.716 "num_base_bdevs_operational": 3, 00:12:36.716 "base_bdevs_list": [ 00:12:36.716 { 00:12:36.716 "name": "BaseBdev1", 00:12:36.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.716 "is_configured": false, 00:12:36.716 "data_offset": 0, 00:12:36.716 "data_size": 0 00:12:36.716 }, 00:12:36.716 { 00:12:36.716 "name": null, 00:12:36.716 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:36.716 "is_configured": false, 00:12:36.716 "data_offset": 2048, 00:12:36.716 "data_size": 63488 00:12:36.716 }, 00:12:36.716 { 00:12:36.716 "name": "BaseBdev3", 00:12:36.716 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:36.716 "is_configured": true, 00:12:36.716 "data_offset": 2048, 00:12:36.716 "data_size": 63488 00:12:36.716 } 00:12:36.716 ] 00:12:36.716 }' 00:12:36.716 18:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.716 18:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:37.348 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:37.348 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.348 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:37.348 18:49:22 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:37.608 [2024-07-24 18:49:22.408508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.608 BaseBdev1 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.608 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:37.867 [ 00:12:37.867 { 00:12:37.867 "name": "BaseBdev1", 00:12:37.867 "aliases": [ 00:12:37.867 "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c" 00:12:37.867 ], 00:12:37.867 "product_name": "Malloc disk", 00:12:37.867 "block_size": 512, 00:12:37.867 "num_blocks": 65536, 00:12:37.867 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:37.867 "assigned_rate_limits": { 00:12:37.867 "rw_ios_per_sec": 0, 00:12:37.867 "rw_mbytes_per_sec": 0, 00:12:37.867 "r_mbytes_per_sec": 0, 00:12:37.867 "w_mbytes_per_sec": 0 00:12:37.867 }, 00:12:37.867 "claimed": true, 00:12:37.867 "claim_type": "exclusive_write", 00:12:37.867 "zoned": false, 00:12:37.867 "supported_io_types": { 00:12:37.867 "read": true, 00:12:37.867 "write": true, 00:12:37.867 "unmap": true, 00:12:37.867 "flush": true, 00:12:37.867 "reset": true, 00:12:37.867 "nvme_admin": false, 00:12:37.867 "nvme_io": false, 00:12:37.867 "nvme_io_md": false, 00:12:37.867 "write_zeroes": true, 00:12:37.867 "zcopy": true, 00:12:37.867 "get_zone_info": false, 00:12:37.867 "zone_management": false, 00:12:37.867 "zone_append": false, 00:12:37.867 "compare": false, 00:12:37.867 "compare_and_write": false, 00:12:37.867 "abort": true, 00:12:37.867 "seek_hole": false, 00:12:37.867 "seek_data": false, 00:12:37.867 "copy": true, 00:12:37.867 "nvme_iov_md": false 00:12:37.867 }, 00:12:37.867 "memory_domains": [ 00:12:37.867 { 00:12:37.867 "dma_device_id": "system", 00:12:37.867 "dma_device_type": 1 00:12:37.867 }, 00:12:37.867 { 00:12:37.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.867 "dma_device_type": 2 00:12:37.867 } 00:12:37.867 ], 00:12:37.867 "driver_specific": {} 00:12:37.867 } 00:12:37.867 ] 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.867 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.126 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.126 "name": "Existed_Raid", 00:12:38.126 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:38.126 "strip_size_kb": 64, 00:12:38.126 "state": "configuring", 00:12:38.126 "raid_level": "concat", 00:12:38.126 "superblock": true, 00:12:38.126 "num_base_bdevs": 3, 00:12:38.126 "num_base_bdevs_discovered": 2, 00:12:38.126 "num_base_bdevs_operational": 3, 00:12:38.126 "base_bdevs_list": [ 00:12:38.126 { 00:12:38.126 "name": "BaseBdev1", 00:12:38.126 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:38.126 "is_configured": true, 00:12:38.126 "data_offset": 2048, 00:12:38.126 "data_size": 63488 00:12:38.126 }, 00:12:38.126 { 00:12:38.126 "name": null, 00:12:38.126 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:38.126 "is_configured": false, 00:12:38.126 "data_offset": 2048, 00:12:38.126 "data_size": 63488 00:12:38.126 }, 00:12:38.126 { 00:12:38.126 "name": "BaseBdev3", 00:12:38.126 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:38.126 "is_configured": true, 00:12:38.126 "data_offset": 2048, 00:12:38.126 "data_size": 63488 00:12:38.126 } 00:12:38.126 ] 00:12:38.126 }' 00:12:38.126 18:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.126 18:49:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.693 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.693 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:38.693 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:38.693 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:38.952 [2024-07-24 18:49:23.743967] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:38.952 18:49:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.952 "name": "Existed_Raid", 00:12:38.952 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:38.952 "strip_size_kb": 64, 00:12:38.952 "state": "configuring", 00:12:38.952 "raid_level": "concat", 00:12:38.952 "superblock": true, 00:12:38.952 "num_base_bdevs": 3, 00:12:38.952 "num_base_bdevs_discovered": 1, 00:12:38.952 "num_base_bdevs_operational": 3, 00:12:38.952 "base_bdevs_list": [ 00:12:38.952 { 00:12:38.952 "name": "BaseBdev1", 00:12:38.952 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:38.952 "is_configured": true, 00:12:38.952 "data_offset": 2048, 00:12:38.952 "data_size": 63488 00:12:38.952 }, 00:12:38.952 { 00:12:38.952 "name": null, 00:12:38.952 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:38.952 "is_configured": false, 00:12:38.952 "data_offset": 2048, 00:12:38.952 "data_size": 63488 00:12:38.952 }, 00:12:38.952 { 00:12:38.952 "name": null, 00:12:38.952 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:38.952 "is_configured": false, 00:12:38.952 "data_offset": 2048, 00:12:38.952 "data_size": 63488 00:12:38.952 } 00:12:38.952 ] 00:12:38.952 }' 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.952 18:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:39.517 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.517 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:39.775 [2024-07-24 18:49:24.750586] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev3 is claimed 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.775 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.034 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.034 "name": "Existed_Raid", 00:12:40.034 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:40.034 "strip_size_kb": 64, 00:12:40.034 "state": "configuring", 00:12:40.034 "raid_level": "concat", 00:12:40.034 "superblock": true, 00:12:40.034 "num_base_bdevs": 3, 00:12:40.034 "num_base_bdevs_discovered": 2, 00:12:40.034 "num_base_bdevs_operational": 3, 00:12:40.034 "base_bdevs_list": [ 00:12:40.034 { 00:12:40.034 "name": "BaseBdev1", 00:12:40.034 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:40.034 "is_configured": true, 00:12:40.034 "data_offset": 2048, 00:12:40.034 "data_size": 63488 00:12:40.034 }, 00:12:40.034 { 00:12:40.034 "name": null, 00:12:40.034 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:40.034 "is_configured": false, 00:12:40.034 "data_offset": 2048, 00:12:40.034 "data_size": 63488 00:12:40.034 }, 00:12:40.034 { 00:12:40.034 "name": "BaseBdev3", 00:12:40.034 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:40.034 "is_configured": true, 00:12:40.034 "data_offset": 2048, 00:12:40.034 "data_size": 63488 00:12:40.034 } 00:12:40.034 ] 00:12:40.034 }' 00:12:40.034 18:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.034 18:49:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.600 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.600 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:40.600 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:40.600 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:40.859 [2024-07-24 18:49:25.753195] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.859 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.117 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.117 "name": "Existed_Raid", 00:12:41.117 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:41.117 "strip_size_kb": 64, 00:12:41.117 "state": "configuring", 00:12:41.117 "raid_level": "concat", 00:12:41.117 "superblock": true, 00:12:41.117 "num_base_bdevs": 3, 00:12:41.117 "num_base_bdevs_discovered": 1, 00:12:41.117 "num_base_bdevs_operational": 3, 00:12:41.117 "base_bdevs_list": [ 00:12:41.117 { 00:12:41.117 "name": null, 00:12:41.117 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:41.117 "is_configured": false, 00:12:41.117 "data_offset": 2048, 00:12:41.117 "data_size": 63488 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "name": null, 00:12:41.117 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:41.117 "is_configured": false, 00:12:41.117 "data_offset": 2048, 00:12:41.117 "data_size": 63488 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "name": "BaseBdev3", 00:12:41.117 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:41.117 "is_configured": true, 00:12:41.117 "data_offset": 2048, 00:12:41.117 "data_size": 63488 00:12:41.117 } 00:12:41.117 ] 00:12:41.117 }' 00:12:41.117 18:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.117 18:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:41.684 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:41.684 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.684 18:49:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:41.684 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:41.942 [2024-07-24 18:49:26.757639] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.942 "name": "Existed_Raid", 00:12:41.942 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:41.942 "strip_size_kb": 64, 00:12:41.942 "state": "configuring", 00:12:41.942 "raid_level": "concat", 00:12:41.942 "superblock": true, 00:12:41.942 "num_base_bdevs": 3, 00:12:41.942 "num_base_bdevs_discovered": 2, 00:12:41.942 "num_base_bdevs_operational": 3, 00:12:41.942 "base_bdevs_list": [ 00:12:41.942 { 00:12:41.942 "name": null, 00:12:41.942 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:41.942 "is_configured": false, 00:12:41.942 "data_offset": 2048, 00:12:41.942 "data_size": 63488 00:12:41.942 }, 00:12:41.942 { 00:12:41.942 "name": "BaseBdev2", 00:12:41.942 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:41.942 "is_configured": true, 00:12:41.942 "data_offset": 2048, 00:12:41.942 "data_size": 63488 00:12:41.942 }, 00:12:41.942 { 00:12:41.942 "name": "BaseBdev3", 00:12:41.942 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:41.942 "is_configured": true, 00:12:41.942 "data_offset": 2048, 00:12:41.942 "data_size": 63488 00:12:41.942 } 00:12:41.942 ] 00:12:41.942 }' 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.942 18:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:42.509 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.509 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:42.767 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:42.767 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.767 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:42.767 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c 00:12:43.025 [2024-07-24 18:49:27.927324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:43.025 [2024-07-24 18:49:27.927449] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e5a60 00:12:43.025 [2024-07-24 18:49:27.927457] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:43.025 [2024-07-24 18:49:27.927599] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x172d2d0 00:12:43.025 [2024-07-24 18:49:27.927695] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e5a60 00:12:43.025 [2024-07-24 18:49:27.927701] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18e5a60 00:12:43.025 [2024-07-24 18:49:27.927773] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:43.025 NewBaseBdev 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:43.025 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:43.026 18:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:43.284 [ 00:12:43.284 { 00:12:43.284 "name": "NewBaseBdev", 00:12:43.284 "aliases": [ 00:12:43.284 "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c" 00:12:43.284 ], 00:12:43.284 "product_name": "Malloc disk", 00:12:43.284 "block_size": 512, 00:12:43.284 "num_blocks": 65536, 00:12:43.284 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:43.284 "assigned_rate_limits": { 00:12:43.284 "rw_ios_per_sec": 0, 00:12:43.284 "rw_mbytes_per_sec": 0, 00:12:43.284 "r_mbytes_per_sec": 0, 00:12:43.284 "w_mbytes_per_sec": 0 00:12:43.284 }, 00:12:43.284 "claimed": true, 00:12:43.284 "claim_type": "exclusive_write", 00:12:43.284 "zoned": 
false, 00:12:43.284 "supported_io_types": { 00:12:43.284 "read": true, 00:12:43.284 "write": true, 00:12:43.284 "unmap": true, 00:12:43.284 "flush": true, 00:12:43.284 "reset": true, 00:12:43.284 "nvme_admin": false, 00:12:43.284 "nvme_io": false, 00:12:43.284 "nvme_io_md": false, 00:12:43.284 "write_zeroes": true, 00:12:43.284 "zcopy": true, 00:12:43.284 "get_zone_info": false, 00:12:43.284 "zone_management": false, 00:12:43.284 "zone_append": false, 00:12:43.284 "compare": false, 00:12:43.284 "compare_and_write": false, 00:12:43.284 "abort": true, 00:12:43.284 "seek_hole": false, 00:12:43.284 "seek_data": false, 00:12:43.284 "copy": true, 00:12:43.284 "nvme_iov_md": false 00:12:43.284 }, 00:12:43.284 "memory_domains": [ 00:12:43.284 { 00:12:43.284 "dma_device_id": "system", 00:12:43.284 "dma_device_type": 1 00:12:43.284 }, 00:12:43.284 { 00:12:43.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.284 "dma_device_type": 2 00:12:43.284 } 00:12:43.284 ], 00:12:43.284 "driver_specific": {} 00:12:43.284 } 00:12:43.284 ] 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.284 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.285 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.285 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.285 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.543 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.543 "name": "Existed_Raid", 00:12:43.543 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:43.543 "strip_size_kb": 64, 00:12:43.543 "state": "online", 00:12:43.543 "raid_level": "concat", 00:12:43.543 "superblock": true, 00:12:43.543 "num_base_bdevs": 3, 00:12:43.543 "num_base_bdevs_discovered": 3, 00:12:43.543 "num_base_bdevs_operational": 3, 00:12:43.543 "base_bdevs_list": [ 00:12:43.543 { 00:12:43.543 "name": "NewBaseBdev", 00:12:43.543 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:43.543 "is_configured": true, 00:12:43.543 "data_offset": 2048, 00:12:43.543 "data_size": 63488 00:12:43.543 }, 00:12:43.543 { 00:12:43.543 "name": "BaseBdev2", 00:12:43.543 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:43.543 "is_configured": true, 00:12:43.543 
"data_offset": 2048, 00:12:43.543 "data_size": 63488 00:12:43.543 }, 00:12:43.543 { 00:12:43.543 "name": "BaseBdev3", 00:12:43.543 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:43.543 "is_configured": true, 00:12:43.543 "data_offset": 2048, 00:12:43.543 "data_size": 63488 00:12:43.543 } 00:12:43.543 ] 00:12:43.543 }' 00:12:43.543 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.543 18:49:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:44.110 18:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:44.110 [2024-07-24 18:49:29.086544] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:44.110 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:44.110 "name": "Existed_Raid", 00:12:44.110 "aliases": [ 00:12:44.110 "aa1891fb-2feb-448d-9a84-25d3dfed5745" 00:12:44.110 ], 00:12:44.110 "product_name": "Raid Volume", 00:12:44.110 "block_size": 512, 00:12:44.110 "num_blocks": 190464, 00:12:44.110 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:44.110 "assigned_rate_limits": { 00:12:44.110 "rw_ios_per_sec": 0, 00:12:44.110 "rw_mbytes_per_sec": 0, 00:12:44.110 "r_mbytes_per_sec": 0, 00:12:44.110 "w_mbytes_per_sec": 0 00:12:44.110 }, 00:12:44.110 "claimed": false, 00:12:44.110 "zoned": false, 00:12:44.110 "supported_io_types": { 00:12:44.110 "read": true, 00:12:44.110 "write": true, 00:12:44.110 "unmap": true, 00:12:44.110 "flush": true, 00:12:44.110 "reset": true, 00:12:44.110 "nvme_admin": false, 00:12:44.110 "nvme_io": false, 00:12:44.110 "nvme_io_md": false, 00:12:44.110 "write_zeroes": true, 00:12:44.110 "zcopy": false, 00:12:44.110 "get_zone_info": false, 00:12:44.110 "zone_management": false, 00:12:44.110 "zone_append": false, 00:12:44.110 "compare": false, 00:12:44.110 "compare_and_write": false, 00:12:44.110 "abort": false, 00:12:44.110 "seek_hole": false, 00:12:44.110 "seek_data": false, 00:12:44.110 "copy": false, 00:12:44.110 "nvme_iov_md": false 00:12:44.110 }, 00:12:44.111 "memory_domains": [ 00:12:44.111 { 00:12:44.111 "dma_device_id": "system", 00:12:44.111 "dma_device_type": 1 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.111 "dma_device_type": 2 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "dma_device_id": "system", 00:12:44.111 "dma_device_type": 1 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.111 "dma_device_type": 2 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "dma_device_id": 
"system", 00:12:44.111 "dma_device_type": 1 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.111 "dma_device_type": 2 00:12:44.111 } 00:12:44.111 ], 00:12:44.111 "driver_specific": { 00:12:44.111 "raid": { 00:12:44.111 "uuid": "aa1891fb-2feb-448d-9a84-25d3dfed5745", 00:12:44.111 "strip_size_kb": 64, 00:12:44.111 "state": "online", 00:12:44.111 "raid_level": "concat", 00:12:44.111 "superblock": true, 00:12:44.111 "num_base_bdevs": 3, 00:12:44.111 "num_base_bdevs_discovered": 3, 00:12:44.111 "num_base_bdevs_operational": 3, 00:12:44.111 "base_bdevs_list": [ 00:12:44.111 { 00:12:44.111 "name": "NewBaseBdev", 00:12:44.111 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:44.111 "is_configured": true, 00:12:44.111 "data_offset": 2048, 00:12:44.111 "data_size": 63488 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "name": "BaseBdev2", 00:12:44.111 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:44.111 "is_configured": true, 00:12:44.111 "data_offset": 2048, 00:12:44.111 "data_size": 63488 00:12:44.111 }, 00:12:44.111 { 00:12:44.111 "name": "BaseBdev3", 00:12:44.111 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:44.111 "is_configured": true, 00:12:44.111 "data_offset": 2048, 00:12:44.111 "data_size": 63488 00:12:44.111 } 00:12:44.111 ] 00:12:44.111 } 00:12:44.111 } 00:12:44.111 }' 00:12:44.111 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:44.370 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:44.370 BaseBdev2 00:12:44.370 BaseBdev3' 00:12:44.370 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.370 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:44.370 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.370 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.370 "name": "NewBaseBdev", 00:12:44.370 "aliases": [ 00:12:44.370 "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c" 00:12:44.370 ], 00:12:44.370 "product_name": "Malloc disk", 00:12:44.370 "block_size": 512, 00:12:44.370 "num_blocks": 65536, 00:12:44.370 "uuid": "3e9b8be1-d8fa-4f8c-90e5-86f24da20a3c", 00:12:44.370 "assigned_rate_limits": { 00:12:44.370 "rw_ios_per_sec": 0, 00:12:44.370 "rw_mbytes_per_sec": 0, 00:12:44.370 "r_mbytes_per_sec": 0, 00:12:44.370 "w_mbytes_per_sec": 0 00:12:44.370 }, 00:12:44.370 "claimed": true, 00:12:44.370 "claim_type": "exclusive_write", 00:12:44.370 "zoned": false, 00:12:44.370 "supported_io_types": { 00:12:44.370 "read": true, 00:12:44.370 "write": true, 00:12:44.370 "unmap": true, 00:12:44.370 "flush": true, 00:12:44.370 "reset": true, 00:12:44.370 "nvme_admin": false, 00:12:44.370 "nvme_io": false, 00:12:44.370 "nvme_io_md": false, 00:12:44.370 "write_zeroes": true, 00:12:44.370 "zcopy": true, 00:12:44.370 "get_zone_info": false, 00:12:44.370 "zone_management": false, 00:12:44.370 "zone_append": false, 00:12:44.370 "compare": false, 00:12:44.370 "compare_and_write": false, 00:12:44.370 "abort": true, 00:12:44.370 "seek_hole": false, 00:12:44.370 "seek_data": false, 00:12:44.370 "copy": true, 00:12:44.370 "nvme_iov_md": false 00:12:44.370 }, 00:12:44.370 
"memory_domains": [ 00:12:44.370 { 00:12:44.370 "dma_device_id": "system", 00:12:44.370 "dma_device_type": 1 00:12:44.370 }, 00:12:44.370 { 00:12:44.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.370 "dma_device_type": 2 00:12:44.370 } 00:12:44.370 ], 00:12:44.371 "driver_specific": {} 00:12:44.371 }' 00:12:44.371 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.371 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.629 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.888 "name": "BaseBdev2", 00:12:44.888 "aliases": [ 00:12:44.888 "a18ab113-a3ee-4e1d-a67a-17d3588f6c54" 00:12:44.888 ], 00:12:44.888 "product_name": "Malloc disk", 00:12:44.888 "block_size": 512, 00:12:44.888 "num_blocks": 65536, 00:12:44.888 "uuid": "a18ab113-a3ee-4e1d-a67a-17d3588f6c54", 00:12:44.888 "assigned_rate_limits": { 00:12:44.888 "rw_ios_per_sec": 0, 00:12:44.888 "rw_mbytes_per_sec": 0, 00:12:44.888 "r_mbytes_per_sec": 0, 00:12:44.888 "w_mbytes_per_sec": 0 00:12:44.888 }, 00:12:44.888 "claimed": true, 00:12:44.888 "claim_type": "exclusive_write", 00:12:44.888 "zoned": false, 00:12:44.888 "supported_io_types": { 00:12:44.888 "read": true, 00:12:44.888 "write": true, 00:12:44.888 "unmap": true, 00:12:44.888 "flush": true, 00:12:44.888 "reset": true, 00:12:44.888 "nvme_admin": false, 00:12:44.888 "nvme_io": false, 00:12:44.888 "nvme_io_md": false, 00:12:44.888 "write_zeroes": true, 00:12:44.888 "zcopy": true, 00:12:44.888 "get_zone_info": false, 00:12:44.888 "zone_management": false, 00:12:44.888 "zone_append": false, 00:12:44.888 "compare": false, 00:12:44.888 "compare_and_write": false, 00:12:44.888 "abort": true, 00:12:44.888 "seek_hole": false, 00:12:44.888 "seek_data": false, 00:12:44.888 "copy": true, 00:12:44.888 "nvme_iov_md": false 00:12:44.888 }, 00:12:44.888 "memory_domains": [ 00:12:44.888 { 00:12:44.888 "dma_device_id": "system", 00:12:44.888 "dma_device_type": 1 00:12:44.888 }, 
00:12:44.888 { 00:12:44.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.888 "dma_device_type": 2 00:12:44.888 } 00:12:44.888 ], 00:12:44.888 "driver_specific": {} 00:12:44.888 }' 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.888 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.147 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.147 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.147 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.147 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:45.147 18:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.147 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.147 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.147 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.147 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:45.406 "name": "BaseBdev3", 00:12:45.406 "aliases": [ 00:12:45.406 "8f66d0f1-4f67-4db5-b68b-8226593fc891" 00:12:45.406 ], 00:12:45.406 "product_name": "Malloc disk", 00:12:45.406 "block_size": 512, 00:12:45.406 "num_blocks": 65536, 00:12:45.406 "uuid": "8f66d0f1-4f67-4db5-b68b-8226593fc891", 00:12:45.406 "assigned_rate_limits": { 00:12:45.406 "rw_ios_per_sec": 0, 00:12:45.406 "rw_mbytes_per_sec": 0, 00:12:45.406 "r_mbytes_per_sec": 0, 00:12:45.406 "w_mbytes_per_sec": 0 00:12:45.406 }, 00:12:45.406 "claimed": true, 00:12:45.406 "claim_type": "exclusive_write", 00:12:45.406 "zoned": false, 00:12:45.406 "supported_io_types": { 00:12:45.406 "read": true, 00:12:45.406 "write": true, 00:12:45.406 "unmap": true, 00:12:45.406 "flush": true, 00:12:45.406 "reset": true, 00:12:45.406 "nvme_admin": false, 00:12:45.406 "nvme_io": false, 00:12:45.406 "nvme_io_md": false, 00:12:45.406 "write_zeroes": true, 00:12:45.406 "zcopy": true, 00:12:45.406 "get_zone_info": false, 00:12:45.406 "zone_management": false, 00:12:45.406 "zone_append": false, 00:12:45.406 "compare": false, 00:12:45.406 "compare_and_write": false, 00:12:45.406 "abort": true, 00:12:45.406 "seek_hole": false, 00:12:45.406 "seek_data": false, 00:12:45.406 "copy": true, 00:12:45.406 "nvme_iov_md": false 00:12:45.406 }, 00:12:45.406 "memory_domains": [ 00:12:45.406 { 00:12:45.406 "dma_device_id": "system", 00:12:45.406 "dma_device_type": 1 00:12:45.406 }, 00:12:45.406 { 00:12:45.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.406 "dma_device_type": 2 00:12:45.406 } 00:12:45.406 
], 00:12:45.406 "driver_specific": {} 00:12:45.406 }' 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.406 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.665 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:45.924 [2024-07-24 18:49:30.786883] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:45.924 [2024-07-24 18:49:30.786902] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:45.924 [2024-07-24 18:49:30.786954] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:45.924 [2024-07-24 18:49:30.786990] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:45.924 [2024-07-24 18:49:30.786995] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e5a60 name Existed_Raid, state offline 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2084023 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2084023 ']' 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2084023 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2084023 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2084023' 00:12:45.924 killing process with pid 2084023 00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2084023 00:12:45.924 [2024-07-24 18:49:30.844073] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:12:45.924 18:49:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2084023 00:12:45.925 [2024-07-24 18:49:30.867476] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:46.184 18:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:46.184 00:12:46.184 real 0m21.292s 00:12:46.184 user 0m39.659s 00:12:46.184 sys 0m3.288s 00:12:46.184 18:49:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.184 18:49:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.184 ************************************ 00:12:46.184 END TEST raid_state_function_test_sb 00:12:46.184 ************************************ 00:12:46.184 18:49:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:12:46.184 18:49:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:46.184 18:49:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.184 18:49:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:46.184 ************************************ 00:12:46.184 START TEST raid_superblock_test 00:12:46.184 ************************************ 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2088265 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2088265 /var/tmp/spdk-raid.sock 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:46.184 18:49:31 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2088265 ']' 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:46.184 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:46.184 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.184 [2024-07-24 18:49:31.165894] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:12:46.184 [2024-07-24 18:49:31.165929] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2088265 ] 00:12:46.443 [2024-07-24 18:49:31.228551] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:46.443 [2024-07-24 18:49:31.300837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:46.443 [2024-07-24 18:49:31.355547] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:46.443 [2024-07-24 18:49:31.355575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:47.012 18:49:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:47.271 malloc1 00:12:47.271 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:47.271 [2024-07-24 18:49:32.275837] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:47.271 [2024-07-24 18:49:32.275873] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:47.271 [2024-07-24 18:49:32.275886] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c98e20 00:12:47.271 [2024-07-24 18:49:32.275912] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:47.271 [2024-07-24 18:49:32.277172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:47.272 [2024-07-24 18:49:32.277193] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:47.272 pt1 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:47.531 malloc2 00:12:47.531 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:47.789 [2024-07-24 18:49:32.608449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:47.789 [2024-07-24 18:49:32.608484] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:47.790 [2024-07-24 18:49:32.608493] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e42ed0 00:12:47.790 [2024-07-24 18:49:32.608499] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:47.790 [2024-07-24 18:49:32.609566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:47.790 [2024-07-24 18:49:32.609585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:47.790 pt2 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:47.790 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:47.790 malloc3 00:12:48.048 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:48.048 [2024-07-24 18:49:32.944819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:48.048 [2024-07-24 18:49:32.944851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.048 [2024-07-24 18:49:32.944860] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e46a30 00:12:48.048 [2024-07-24 18:49:32.944866] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.048 [2024-07-24 18:49:32.945916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.048 [2024-07-24 18:49:32.945939] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:48.048 pt3 00:12:48.048 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:48.048 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:48.048 18:49:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:48.307 [2024-07-24 18:49:33.097230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:48.307 [2024-07-24 18:49:33.098097] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:48.307 [2024-07-24 18:49:33.098134] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:48.307 [2024-07-24 18:49:33.098233] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e47a40 00:12:48.307 [2024-07-24 18:49:33.098239] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:48.307 [2024-07-24 18:49:33.098366] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e42050 00:12:48.307 [2024-07-24 18:49:33.098462] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e47a40 00:12:48.307 [2024-07-24 18:49:33.098467] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e47a40 00:12:48.307 [2024-07-24 18:49:33.098536] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.307 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.307 "name": "raid_bdev1", 00:12:48.307 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:48.307 "strip_size_kb": 64, 00:12:48.307 "state": "online", 00:12:48.307 "raid_level": "concat", 00:12:48.307 "superblock": true, 00:12:48.307 "num_base_bdevs": 3, 00:12:48.307 "num_base_bdevs_discovered": 3, 00:12:48.307 "num_base_bdevs_operational": 3, 00:12:48.307 "base_bdevs_list": [ 00:12:48.307 { 00:12:48.307 "name": "pt1", 00:12:48.307 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:48.307 "is_configured": true, 00:12:48.308 "data_offset": 2048, 00:12:48.308 "data_size": 63488 00:12:48.308 }, 00:12:48.308 { 00:12:48.308 "name": "pt2", 00:12:48.308 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.308 "is_configured": true, 00:12:48.308 "data_offset": 2048, 00:12:48.308 "data_size": 63488 00:12:48.308 }, 00:12:48.308 { 00:12:48.308 "name": "pt3", 00:12:48.308 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:48.308 "is_configured": true, 00:12:48.308 "data_offset": 2048, 00:12:48.308 "data_size": 63488 00:12:48.308 } 00:12:48.308 ] 00:12:48.308 }' 00:12:48.308 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.308 18:49:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:48.880 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:49.139 [2024-07-24 18:49:33.931556] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:49.139 "name": "raid_bdev1", 00:12:49.139 "aliases": [ 00:12:49.139 "62992cbb-4672-4271-b788-2745028a8b4a" 00:12:49.139 ], 00:12:49.139 "product_name": "Raid Volume", 00:12:49.139 "block_size": 512, 00:12:49.139 "num_blocks": 190464, 00:12:49.139 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:49.139 "assigned_rate_limits": { 00:12:49.139 "rw_ios_per_sec": 0, 00:12:49.139 "rw_mbytes_per_sec": 0, 00:12:49.139 
"r_mbytes_per_sec": 0, 00:12:49.139 "w_mbytes_per_sec": 0 00:12:49.139 }, 00:12:49.139 "claimed": false, 00:12:49.139 "zoned": false, 00:12:49.139 "supported_io_types": { 00:12:49.139 "read": true, 00:12:49.139 "write": true, 00:12:49.139 "unmap": true, 00:12:49.139 "flush": true, 00:12:49.139 "reset": true, 00:12:49.139 "nvme_admin": false, 00:12:49.139 "nvme_io": false, 00:12:49.139 "nvme_io_md": false, 00:12:49.139 "write_zeroes": true, 00:12:49.139 "zcopy": false, 00:12:49.139 "get_zone_info": false, 00:12:49.139 "zone_management": false, 00:12:49.139 "zone_append": false, 00:12:49.139 "compare": false, 00:12:49.139 "compare_and_write": false, 00:12:49.139 "abort": false, 00:12:49.139 "seek_hole": false, 00:12:49.139 "seek_data": false, 00:12:49.139 "copy": false, 00:12:49.139 "nvme_iov_md": false 00:12:49.139 }, 00:12:49.139 "memory_domains": [ 00:12:49.139 { 00:12:49.139 "dma_device_id": "system", 00:12:49.139 "dma_device_type": 1 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.139 "dma_device_type": 2 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "dma_device_id": "system", 00:12:49.139 "dma_device_type": 1 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.139 "dma_device_type": 2 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "dma_device_id": "system", 00:12:49.139 "dma_device_type": 1 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.139 "dma_device_type": 2 00:12:49.139 } 00:12:49.139 ], 00:12:49.139 "driver_specific": { 00:12:49.139 "raid": { 00:12:49.139 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:49.139 "strip_size_kb": 64, 00:12:49.139 "state": "online", 00:12:49.139 "raid_level": "concat", 00:12:49.139 "superblock": true, 00:12:49.139 "num_base_bdevs": 3, 00:12:49.139 "num_base_bdevs_discovered": 3, 00:12:49.139 "num_base_bdevs_operational": 3, 00:12:49.139 "base_bdevs_list": [ 00:12:49.139 { 00:12:49.139 "name": "pt1", 00:12:49.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:49.139 "is_configured": true, 00:12:49.139 "data_offset": 2048, 00:12:49.139 "data_size": 63488 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "name": "pt2", 00:12:49.139 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:49.139 "is_configured": true, 00:12:49.139 "data_offset": 2048, 00:12:49.139 "data_size": 63488 00:12:49.139 }, 00:12:49.139 { 00:12:49.139 "name": "pt3", 00:12:49.139 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:49.139 "is_configured": true, 00:12:49.139 "data_offset": 2048, 00:12:49.139 "data_size": 63488 00:12:49.139 } 00:12:49.139 ] 00:12:49.139 } 00:12:49.139 } 00:12:49.139 }' 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:49.139 pt2 00:12:49.139 pt3' 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:49.139 18:49:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:49.398 "name": "pt1", 00:12:49.398 "aliases": [ 
00:12:49.398 "00000000-0000-0000-0000-000000000001" 00:12:49.398 ], 00:12:49.398 "product_name": "passthru", 00:12:49.398 "block_size": 512, 00:12:49.398 "num_blocks": 65536, 00:12:49.398 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:49.398 "assigned_rate_limits": { 00:12:49.398 "rw_ios_per_sec": 0, 00:12:49.398 "rw_mbytes_per_sec": 0, 00:12:49.398 "r_mbytes_per_sec": 0, 00:12:49.398 "w_mbytes_per_sec": 0 00:12:49.398 }, 00:12:49.398 "claimed": true, 00:12:49.398 "claim_type": "exclusive_write", 00:12:49.398 "zoned": false, 00:12:49.398 "supported_io_types": { 00:12:49.398 "read": true, 00:12:49.398 "write": true, 00:12:49.398 "unmap": true, 00:12:49.398 "flush": true, 00:12:49.398 "reset": true, 00:12:49.398 "nvme_admin": false, 00:12:49.398 "nvme_io": false, 00:12:49.398 "nvme_io_md": false, 00:12:49.398 "write_zeroes": true, 00:12:49.398 "zcopy": true, 00:12:49.398 "get_zone_info": false, 00:12:49.398 "zone_management": false, 00:12:49.398 "zone_append": false, 00:12:49.398 "compare": false, 00:12:49.398 "compare_and_write": false, 00:12:49.398 "abort": true, 00:12:49.398 "seek_hole": false, 00:12:49.398 "seek_data": false, 00:12:49.398 "copy": true, 00:12:49.398 "nvme_iov_md": false 00:12:49.398 }, 00:12:49.398 "memory_domains": [ 00:12:49.398 { 00:12:49.398 "dma_device_id": "system", 00:12:49.398 "dma_device_type": 1 00:12:49.398 }, 00:12:49.398 { 00:12:49.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.398 "dma_device_type": 2 00:12:49.398 } 00:12:49.398 ], 00:12:49.398 "driver_specific": { 00:12:49.398 "passthru": { 00:12:49.398 "name": "pt1", 00:12:49.398 "base_bdev_name": "malloc1" 00:12:49.398 } 00:12:49.398 } 00:12:49.398 }' 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.398 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:49.657 "name": "pt2", 00:12:49.657 "aliases": [ 00:12:49.657 "00000000-0000-0000-0000-000000000002" 00:12:49.657 ], 00:12:49.657 "product_name": "passthru", 00:12:49.657 "block_size": 
512, 00:12:49.657 "num_blocks": 65536, 00:12:49.657 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:49.657 "assigned_rate_limits": { 00:12:49.657 "rw_ios_per_sec": 0, 00:12:49.657 "rw_mbytes_per_sec": 0, 00:12:49.657 "r_mbytes_per_sec": 0, 00:12:49.657 "w_mbytes_per_sec": 0 00:12:49.657 }, 00:12:49.657 "claimed": true, 00:12:49.657 "claim_type": "exclusive_write", 00:12:49.657 "zoned": false, 00:12:49.657 "supported_io_types": { 00:12:49.657 "read": true, 00:12:49.657 "write": true, 00:12:49.657 "unmap": true, 00:12:49.657 "flush": true, 00:12:49.657 "reset": true, 00:12:49.657 "nvme_admin": false, 00:12:49.657 "nvme_io": false, 00:12:49.657 "nvme_io_md": false, 00:12:49.657 "write_zeroes": true, 00:12:49.657 "zcopy": true, 00:12:49.657 "get_zone_info": false, 00:12:49.657 "zone_management": false, 00:12:49.657 "zone_append": false, 00:12:49.657 "compare": false, 00:12:49.657 "compare_and_write": false, 00:12:49.657 "abort": true, 00:12:49.657 "seek_hole": false, 00:12:49.657 "seek_data": false, 00:12:49.657 "copy": true, 00:12:49.657 "nvme_iov_md": false 00:12:49.657 }, 00:12:49.657 "memory_domains": [ 00:12:49.657 { 00:12:49.657 "dma_device_id": "system", 00:12:49.657 "dma_device_type": 1 00:12:49.657 }, 00:12:49.657 { 00:12:49.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.657 "dma_device_type": 2 00:12:49.657 } 00:12:49.657 ], 00:12:49.657 "driver_specific": { 00:12:49.657 "passthru": { 00:12:49.657 "name": "pt2", 00:12:49.657 "base_bdev_name": "malloc2" 00:12:49.657 } 00:12:49.657 } 00:12:49.657 }' 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.657 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:49.915 18:49:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:50.174 "name": "pt3", 00:12:50.174 "aliases": [ 00:12:50.174 "00000000-0000-0000-0000-000000000003" 00:12:50.174 ], 00:12:50.174 "product_name": "passthru", 00:12:50.174 "block_size": 512, 00:12:50.174 "num_blocks": 65536, 00:12:50.174 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:50.174 
"assigned_rate_limits": { 00:12:50.174 "rw_ios_per_sec": 0, 00:12:50.174 "rw_mbytes_per_sec": 0, 00:12:50.174 "r_mbytes_per_sec": 0, 00:12:50.174 "w_mbytes_per_sec": 0 00:12:50.174 }, 00:12:50.174 "claimed": true, 00:12:50.174 "claim_type": "exclusive_write", 00:12:50.174 "zoned": false, 00:12:50.174 "supported_io_types": { 00:12:50.174 "read": true, 00:12:50.174 "write": true, 00:12:50.174 "unmap": true, 00:12:50.174 "flush": true, 00:12:50.174 "reset": true, 00:12:50.174 "nvme_admin": false, 00:12:50.174 "nvme_io": false, 00:12:50.174 "nvme_io_md": false, 00:12:50.174 "write_zeroes": true, 00:12:50.174 "zcopy": true, 00:12:50.174 "get_zone_info": false, 00:12:50.174 "zone_management": false, 00:12:50.174 "zone_append": false, 00:12:50.174 "compare": false, 00:12:50.174 "compare_and_write": false, 00:12:50.174 "abort": true, 00:12:50.174 "seek_hole": false, 00:12:50.174 "seek_data": false, 00:12:50.174 "copy": true, 00:12:50.174 "nvme_iov_md": false 00:12:50.174 }, 00:12:50.174 "memory_domains": [ 00:12:50.174 { 00:12:50.174 "dma_device_id": "system", 00:12:50.174 "dma_device_type": 1 00:12:50.174 }, 00:12:50.174 { 00:12:50.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.174 "dma_device_type": 2 00:12:50.174 } 00:12:50.174 ], 00:12:50.174 "driver_specific": { 00:12:50.174 "passthru": { 00:12:50.174 "name": "pt3", 00:12:50.174 "base_bdev_name": "malloc3" 00:12:50.174 } 00:12:50.174 } 00:12:50.174 }' 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.174 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:50.433 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:50.690 [2024-07-24 18:49:35.483543] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:50.691 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=62992cbb-4672-4271-b788-2745028a8b4a 00:12:50.691 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 62992cbb-4672-4271-b788-2745028a8b4a ']' 00:12:50.691 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:50.691 [2024-07-24 18:49:35.651797] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:50.691 [2024-07-24 18:49:35.651809] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:50.691 [2024-07-24 18:49:35.651843] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:50.691 [2024-07-24 18:49:35.651879] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:50.691 [2024-07-24 18:49:35.651885] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e47a40 name raid_bdev1, state offline 00:12:50.691 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.691 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:50.949 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:50.949 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:50.949 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:50.949 18:49:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:51.207 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:51.207 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:51.207 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:51.207 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:51.466 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:51.466 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:51.725 [2024-07-24 18:49:36.658379] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:51.725 [2024-07-24 18:49:36.659360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:51.725 [2024-07-24 18:49:36.659390] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:51.725 [2024-07-24 18:49:36.659420] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:51.725 [2024-07-24 18:49:36.659447] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:51.725 [2024-07-24 18:49:36.659460] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:51.725 [2024-07-24 18:49:36.659475] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:51.725 [2024-07-24 18:49:36.659481] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e43100 name raid_bdev1, state configuring 00:12:51.725 request: 00:12:51.725 { 00:12:51.725 "name": "raid_bdev1", 00:12:51.725 "raid_level": "concat", 00:12:51.725 "base_bdevs": [ 00:12:51.725 "malloc1", 00:12:51.725 "malloc2", 00:12:51.725 "malloc3" 00:12:51.725 ], 00:12:51.725 "strip_size_kb": 64, 00:12:51.725 "superblock": false, 00:12:51.725 "method": "bdev_raid_create", 00:12:51.725 "req_id": 1 00:12:51.725 } 00:12:51.725 Got JSON-RPC error response 00:12:51.725 response: 00:12:51.725 { 00:12:51.725 "code": -17, 00:12:51.725 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:51.725 } 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.725 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:51.984 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:51.984 18:49:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:51.984 18:49:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:52.243 [2024-07-24 18:49:36.995211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:52.243 [2024-07-24 18:49:36.995234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.243 [2024-07-24 18:49:36.995244] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c99a40 00:12:52.243 [2024-07-24 18:49:36.995249] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.243 [2024-07-24 18:49:36.996433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:52.243 [2024-07-24 18:49:36.996453] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:52.243 [2024-07-24 18:49:36.996502] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:52.243 [2024-07-24 18:49:36.996520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:52.243 pt1 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.243 "name": "raid_bdev1", 00:12:52.243 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:52.243 "strip_size_kb": 64, 00:12:52.243 "state": "configuring", 00:12:52.243 "raid_level": "concat", 00:12:52.243 "superblock": true, 00:12:52.243 "num_base_bdevs": 3, 00:12:52.243 "num_base_bdevs_discovered": 1, 00:12:52.243 "num_base_bdevs_operational": 3, 00:12:52.243 "base_bdevs_list": [ 00:12:52.243 { 00:12:52.243 "name": "pt1", 00:12:52.243 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:52.243 "is_configured": true, 00:12:52.243 "data_offset": 2048, 00:12:52.243 "data_size": 63488 00:12:52.243 }, 00:12:52.243 { 00:12:52.243 "name": null, 00:12:52.243 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:52.243 
"is_configured": false, 00:12:52.243 "data_offset": 2048, 00:12:52.243 "data_size": 63488 00:12:52.243 }, 00:12:52.243 { 00:12:52.243 "name": null, 00:12:52.243 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:52.243 "is_configured": false, 00:12:52.243 "data_offset": 2048, 00:12:52.243 "data_size": 63488 00:12:52.243 } 00:12:52.243 ] 00:12:52.243 }' 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.243 18:49:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.809 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:12:52.809 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:52.809 [2024-07-24 18:49:37.817341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:52.809 [2024-07-24 18:49:37.817379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.809 [2024-07-24 18:49:37.817408] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c992c0 00:12:52.809 [2024-07-24 18:49:37.817415] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.809 [2024-07-24 18:49:37.817674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:52.809 [2024-07-24 18:49:37.817685] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:52.809 [2024-07-24 18:49:37.817725] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:52.809 [2024-07-24 18:49:37.817738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:53.068 pt2 00:12:53.068 18:49:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:53.068 [2024-07-24 18:49:37.993815] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.068 18:49:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:53.326 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.326 "name": "raid_bdev1", 00:12:53.326 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:53.326 "strip_size_kb": 64, 00:12:53.326 "state": "configuring", 00:12:53.326 "raid_level": "concat", 00:12:53.326 "superblock": true, 00:12:53.326 "num_base_bdevs": 3, 00:12:53.326 "num_base_bdevs_discovered": 1, 00:12:53.326 "num_base_bdevs_operational": 3, 00:12:53.326 "base_bdevs_list": [ 00:12:53.326 { 00:12:53.326 "name": "pt1", 00:12:53.326 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:53.326 "is_configured": true, 00:12:53.326 "data_offset": 2048, 00:12:53.326 "data_size": 63488 00:12:53.326 }, 00:12:53.326 { 00:12:53.326 "name": null, 00:12:53.326 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.326 "is_configured": false, 00:12:53.326 "data_offset": 2048, 00:12:53.326 "data_size": 63488 00:12:53.326 }, 00:12:53.326 { 00:12:53.326 "name": null, 00:12:53.326 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:53.326 "is_configured": false, 00:12:53.326 "data_offset": 2048, 00:12:53.326 "data_size": 63488 00:12:53.326 } 00:12:53.326 ] 00:12:53.326 }' 00:12:53.326 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.326 18:49:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:53.893 [2024-07-24 18:49:38.819923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:53.893 [2024-07-24 18:49:38.819962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:53.893 [2024-07-24 18:49:38.819974] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e442e0 00:12:53.893 [2024-07-24 18:49:38.819997] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:53.893 [2024-07-24 18:49:38.820239] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:53.893 [2024-07-24 18:49:38.820248] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:53.893 [2024-07-24 18:49:38.820288] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:53.893 [2024-07-24 18:49:38.820300] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:53.893 pt2 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:53.893 18:49:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:54.152 [2024-07-24 18:49:38.988369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:54.152 [2024-07-24 18:49:38.988389] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:12:54.152 [2024-07-24 18:49:38.988397] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e456d0 00:12:54.152 [2024-07-24 18:49:38.988403] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.152 [2024-07-24 18:49:38.988635] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.152 [2024-07-24 18:49:38.988646] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:54.152 [2024-07-24 18:49:38.988681] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:54.152 [2024-07-24 18:49:38.988703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:54.152 [2024-07-24 18:49:38.988777] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e45fd0 00:12:54.152 [2024-07-24 18:49:38.988788] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:54.152 [2024-07-24 18:49:38.988929] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e470f0 00:12:54.153 [2024-07-24 18:49:38.989024] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e45fd0 00:12:54.153 [2024-07-24 18:49:38.989030] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e45fd0 00:12:54.153 [2024-07-24 18:49:38.989099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.153 pt3 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.153 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.411 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.411 "name": "raid_bdev1", 00:12:54.411 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:54.411 "strip_size_kb": 64, 00:12:54.411 "state": "online", 00:12:54.411 "raid_level": "concat", 00:12:54.411 "superblock": true, 00:12:54.411 "num_base_bdevs": 3, 00:12:54.411 
"num_base_bdevs_discovered": 3, 00:12:54.411 "num_base_bdevs_operational": 3, 00:12:54.411 "base_bdevs_list": [ 00:12:54.411 { 00:12:54.411 "name": "pt1", 00:12:54.411 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:54.411 "is_configured": true, 00:12:54.411 "data_offset": 2048, 00:12:54.411 "data_size": 63488 00:12:54.411 }, 00:12:54.411 { 00:12:54.411 "name": "pt2", 00:12:54.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.411 "is_configured": true, 00:12:54.411 "data_offset": 2048, 00:12:54.411 "data_size": 63488 00:12:54.411 }, 00:12:54.411 { 00:12:54.411 "name": "pt3", 00:12:54.411 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:54.411 "is_configured": true, 00:12:54.411 "data_offset": 2048, 00:12:54.411 "data_size": 63488 00:12:54.411 } 00:12:54.411 ] 00:12:54.411 }' 00:12:54.411 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.411 18:49:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:54.670 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:54.928 [2024-07-24 18:49:39.770588] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:54.928 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:54.928 "name": "raid_bdev1", 00:12:54.928 "aliases": [ 00:12:54.928 "62992cbb-4672-4271-b788-2745028a8b4a" 00:12:54.929 ], 00:12:54.929 "product_name": "Raid Volume", 00:12:54.929 "block_size": 512, 00:12:54.929 "num_blocks": 190464, 00:12:54.929 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:54.929 "assigned_rate_limits": { 00:12:54.929 "rw_ios_per_sec": 0, 00:12:54.929 "rw_mbytes_per_sec": 0, 00:12:54.929 "r_mbytes_per_sec": 0, 00:12:54.929 "w_mbytes_per_sec": 0 00:12:54.929 }, 00:12:54.929 "claimed": false, 00:12:54.929 "zoned": false, 00:12:54.929 "supported_io_types": { 00:12:54.929 "read": true, 00:12:54.929 "write": true, 00:12:54.929 "unmap": true, 00:12:54.929 "flush": true, 00:12:54.929 "reset": true, 00:12:54.929 "nvme_admin": false, 00:12:54.929 "nvme_io": false, 00:12:54.929 "nvme_io_md": false, 00:12:54.929 "write_zeroes": true, 00:12:54.929 "zcopy": false, 00:12:54.929 "get_zone_info": false, 00:12:54.929 "zone_management": false, 00:12:54.929 "zone_append": false, 00:12:54.929 "compare": false, 00:12:54.929 "compare_and_write": false, 00:12:54.929 "abort": false, 00:12:54.929 "seek_hole": false, 00:12:54.929 "seek_data": false, 00:12:54.929 "copy": false, 00:12:54.929 "nvme_iov_md": false 00:12:54.929 }, 00:12:54.929 "memory_domains": [ 00:12:54.929 { 00:12:54.929 "dma_device_id": "system", 00:12:54.929 "dma_device_type": 1 00:12:54.929 }, 
00:12:54.929 { 00:12:54.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.929 "dma_device_type": 2 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "dma_device_id": "system", 00:12:54.929 "dma_device_type": 1 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.929 "dma_device_type": 2 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "dma_device_id": "system", 00:12:54.929 "dma_device_type": 1 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.929 "dma_device_type": 2 00:12:54.929 } 00:12:54.929 ], 00:12:54.929 "driver_specific": { 00:12:54.929 "raid": { 00:12:54.929 "uuid": "62992cbb-4672-4271-b788-2745028a8b4a", 00:12:54.929 "strip_size_kb": 64, 00:12:54.929 "state": "online", 00:12:54.929 "raid_level": "concat", 00:12:54.929 "superblock": true, 00:12:54.929 "num_base_bdevs": 3, 00:12:54.929 "num_base_bdevs_discovered": 3, 00:12:54.929 "num_base_bdevs_operational": 3, 00:12:54.929 "base_bdevs_list": [ 00:12:54.929 { 00:12:54.929 "name": "pt1", 00:12:54.929 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:54.929 "is_configured": true, 00:12:54.929 "data_offset": 2048, 00:12:54.929 "data_size": 63488 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "name": "pt2", 00:12:54.929 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.929 "is_configured": true, 00:12:54.929 "data_offset": 2048, 00:12:54.929 "data_size": 63488 00:12:54.929 }, 00:12:54.929 { 00:12:54.929 "name": "pt3", 00:12:54.929 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:54.929 "is_configured": true, 00:12:54.929 "data_offset": 2048, 00:12:54.929 "data_size": 63488 00:12:54.929 } 00:12:54.929 ] 00:12:54.929 } 00:12:54.929 } 00:12:54.929 }' 00:12:54.929 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:54.929 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:54.929 pt2 00:12:54.929 pt3' 00:12:54.929 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:54.929 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:54.929 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:55.188 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:55.188 "name": "pt1", 00:12:55.188 "aliases": [ 00:12:55.188 "00000000-0000-0000-0000-000000000001" 00:12:55.188 ], 00:12:55.188 "product_name": "passthru", 00:12:55.188 "block_size": 512, 00:12:55.188 "num_blocks": 65536, 00:12:55.188 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:55.188 "assigned_rate_limits": { 00:12:55.188 "rw_ios_per_sec": 0, 00:12:55.188 "rw_mbytes_per_sec": 0, 00:12:55.188 "r_mbytes_per_sec": 0, 00:12:55.188 "w_mbytes_per_sec": 0 00:12:55.188 }, 00:12:55.188 "claimed": true, 00:12:55.188 "claim_type": "exclusive_write", 00:12:55.188 "zoned": false, 00:12:55.188 "supported_io_types": { 00:12:55.188 "read": true, 00:12:55.188 "write": true, 00:12:55.188 "unmap": true, 00:12:55.188 "flush": true, 00:12:55.188 "reset": true, 00:12:55.188 "nvme_admin": false, 00:12:55.188 "nvme_io": false, 00:12:55.188 "nvme_io_md": false, 00:12:55.188 "write_zeroes": true, 00:12:55.188 "zcopy": true, 00:12:55.188 "get_zone_info": false, 00:12:55.188 "zone_management": false, 00:12:55.188 
"zone_append": false, 00:12:55.188 "compare": false, 00:12:55.188 "compare_and_write": false, 00:12:55.188 "abort": true, 00:12:55.188 "seek_hole": false, 00:12:55.188 "seek_data": false, 00:12:55.188 "copy": true, 00:12:55.188 "nvme_iov_md": false 00:12:55.188 }, 00:12:55.188 "memory_domains": [ 00:12:55.188 { 00:12:55.188 "dma_device_id": "system", 00:12:55.188 "dma_device_type": 1 00:12:55.188 }, 00:12:55.188 { 00:12:55.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.188 "dma_device_type": 2 00:12:55.188 } 00:12:55.188 ], 00:12:55.188 "driver_specific": { 00:12:55.188 "passthru": { 00:12:55.188 "name": "pt1", 00:12:55.188 "base_bdev_name": "malloc1" 00:12:55.188 } 00:12:55.188 } 00:12:55.188 }' 00:12:55.188 18:49:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.188 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:55.446 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:55.705 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:55.705 "name": "pt2", 00:12:55.705 "aliases": [ 00:12:55.705 "00000000-0000-0000-0000-000000000002" 00:12:55.705 ], 00:12:55.705 "product_name": "passthru", 00:12:55.705 "block_size": 512, 00:12:55.705 "num_blocks": 65536, 00:12:55.705 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:55.705 "assigned_rate_limits": { 00:12:55.705 "rw_ios_per_sec": 0, 00:12:55.705 "rw_mbytes_per_sec": 0, 00:12:55.705 "r_mbytes_per_sec": 0, 00:12:55.705 "w_mbytes_per_sec": 0 00:12:55.705 }, 00:12:55.706 "claimed": true, 00:12:55.706 "claim_type": "exclusive_write", 00:12:55.706 "zoned": false, 00:12:55.706 "supported_io_types": { 00:12:55.706 "read": true, 00:12:55.706 "write": true, 00:12:55.706 "unmap": true, 00:12:55.706 "flush": true, 00:12:55.706 "reset": true, 00:12:55.706 "nvme_admin": false, 00:12:55.706 "nvme_io": false, 00:12:55.706 "nvme_io_md": false, 00:12:55.706 "write_zeroes": true, 00:12:55.706 "zcopy": true, 00:12:55.706 "get_zone_info": false, 00:12:55.706 "zone_management": false, 00:12:55.706 "zone_append": false, 00:12:55.706 "compare": false, 00:12:55.706 "compare_and_write": false, 00:12:55.706 "abort": true, 00:12:55.706 
"seek_hole": false, 00:12:55.706 "seek_data": false, 00:12:55.706 "copy": true, 00:12:55.706 "nvme_iov_md": false 00:12:55.706 }, 00:12:55.706 "memory_domains": [ 00:12:55.706 { 00:12:55.706 "dma_device_id": "system", 00:12:55.706 "dma_device_type": 1 00:12:55.706 }, 00:12:55.706 { 00:12:55.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.706 "dma_device_type": 2 00:12:55.706 } 00:12:55.706 ], 00:12:55.706 "driver_specific": { 00:12:55.706 "passthru": { 00:12:55.706 "name": "pt2", 00:12:55.706 "base_bdev_name": "malloc2" 00:12:55.706 } 00:12:55.706 } 00:12:55.706 }' 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.706 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:56.000 "name": "pt3", 00:12:56.000 "aliases": [ 00:12:56.000 "00000000-0000-0000-0000-000000000003" 00:12:56.000 ], 00:12:56.000 "product_name": "passthru", 00:12:56.000 "block_size": 512, 00:12:56.000 "num_blocks": 65536, 00:12:56.000 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:56.000 "assigned_rate_limits": { 00:12:56.000 "rw_ios_per_sec": 0, 00:12:56.000 "rw_mbytes_per_sec": 0, 00:12:56.000 "r_mbytes_per_sec": 0, 00:12:56.000 "w_mbytes_per_sec": 0 00:12:56.000 }, 00:12:56.000 "claimed": true, 00:12:56.000 "claim_type": "exclusive_write", 00:12:56.000 "zoned": false, 00:12:56.000 "supported_io_types": { 00:12:56.000 "read": true, 00:12:56.000 "write": true, 00:12:56.000 "unmap": true, 00:12:56.000 "flush": true, 00:12:56.000 "reset": true, 00:12:56.000 "nvme_admin": false, 00:12:56.000 "nvme_io": false, 00:12:56.000 "nvme_io_md": false, 00:12:56.000 "write_zeroes": true, 00:12:56.000 "zcopy": true, 00:12:56.000 "get_zone_info": false, 00:12:56.000 "zone_management": false, 00:12:56.000 "zone_append": false, 00:12:56.000 "compare": false, 00:12:56.000 "compare_and_write": false, 00:12:56.000 "abort": true, 00:12:56.000 "seek_hole": false, 00:12:56.000 "seek_data": false, 00:12:56.000 "copy": true, 00:12:56.000 "nvme_iov_md": false 00:12:56.000 }, 
00:12:56.000 "memory_domains": [ 00:12:56.000 { 00:12:56.000 "dma_device_id": "system", 00:12:56.000 "dma_device_type": 1 00:12:56.000 }, 00:12:56.000 { 00:12:56.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.000 "dma_device_type": 2 00:12:56.000 } 00:12:56.000 ], 00:12:56.000 "driver_specific": { 00:12:56.000 "passthru": { 00:12:56.000 "name": "pt3", 00:12:56.000 "base_bdev_name": "malloc3" 00:12:56.000 } 00:12:56.000 } 00:12:56.000 }' 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.000 18:49:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.000 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:56.279 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:56.538 [2024-07-24 18:49:41.290515] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 62992cbb-4672-4271-b788-2745028a8b4a '!=' 62992cbb-4672-4271-b788-2745028a8b4a ']' 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2088265 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2088265 ']' 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2088265 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2088265 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2088265' 00:12:56.538 
killing process with pid 2088265 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2088265 00:12:56.538 [2024-07-24 18:49:41.335646] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:56.538 [2024-07-24 18:49:41.335687] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:56.538 [2024-07-24 18:49:41.335727] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:56.538 [2024-07-24 18:49:41.335732] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e45fd0 name raid_bdev1, state offline 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2088265 00:12:56.538 [2024-07-24 18:49:41.359543] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:56.538 00:12:56.538 real 0m10.418s 00:12:56.538 user 0m19.020s 00:12:56.538 sys 0m1.587s 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:56.538 18:49:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.538 ************************************ 00:12:56.538 END TEST raid_superblock_test 00:12:56.538 ************************************ 00:12:56.797 18:49:41 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:12:56.797 18:49:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:56.797 18:49:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:56.797 18:49:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:56.797 ************************************ 00:12:56.797 START TEST raid_read_error_test 00:12:56.797 ************************************ 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0uuAPHOsbB 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2090166 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2090166 /var/tmp/spdk-raid.sock 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2090166 ']' 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:56.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:56.797 18:49:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.797 [2024-07-24 18:49:41.657461] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
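Once bdevperf is up and listening on /var/tmp/spdk-raid.sock, the read-error test builds each base bdev as a three-layer stack (a malloc backing bdev, an error bdev that allows fault injection, and a passthru bdev that the RAID module actually claims) and only then assembles the concat array, as the trace below shows. A minimal sketch of that RPC sequence, with rpc.py abbreviating the full spdk/scripts/rpc.py path used in the log:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc       # 32 MB backing bdev, 512-byte blocks
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc                  # exposes EE_BaseBdev1_malloc for later error injection
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # ... the same three calls are repeated for BaseBdev2 and BaseBdev3 ...
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s   # 64 KiB strip, with superblock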
00:12:56.797 [2024-07-24 18:49:41.657506] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2090166 ] 00:12:56.797 [2024-07-24 18:49:41.722129] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:56.797 [2024-07-24 18:49:41.801079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:57.056 [2024-07-24 18:49:41.864240] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.056 [2024-07-24 18:49:41.864265] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.622 18:49:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:57.622 18:49:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:57.622 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:57.622 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:57.622 BaseBdev1_malloc 00:12:57.622 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:57.881 true 00:12:57.881 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:58.139 [2024-07-24 18:49:42.932665] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:58.139 [2024-07-24 18:49:42.932695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:58.139 [2024-07-24 18:49:42.932706] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22ecd20 00:12:58.139 [2024-07-24 18:49:42.932711] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:58.139 [2024-07-24 18:49:42.933873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:58.139 [2024-07-24 18:49:42.933893] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:58.139 BaseBdev1 00:12:58.139 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:58.139 18:49:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:58.139 BaseBdev2_malloc 00:12:58.139 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:58.397 true 00:12:58.397 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:58.655 [2024-07-24 18:49:43.437554] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:58.655 [2024-07-24 18:49:43.437587] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:58.655 [2024-07-24 18:49:43.437599] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f1d50 00:12:58.655 [2024-07-24 18:49:43.437605] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:58.655 [2024-07-24 18:49:43.438690] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:58.655 [2024-07-24 18:49:43.438710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:58.655 BaseBdev2 00:12:58.655 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:58.656 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:58.656 BaseBdev3_malloc 00:12:58.656 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:58.914 true 00:12:58.914 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:59.172 [2024-07-24 18:49:43.934384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:59.172 [2024-07-24 18:49:43.934415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:59.172 [2024-07-24 18:49:43.934425] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f0ef0 00:12:59.172 [2024-07-24 18:49:43.934431] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:59.172 [2024-07-24 18:49:43.935541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:59.172 [2024-07-24 18:49:43.935561] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:59.172 BaseBdev3 00:12:59.172 18:49:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:59.172 [2024-07-24 18:49:44.098836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.172 [2024-07-24 18:49:44.099728] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:59.172 [2024-07-24 18:49:44.099773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:59.172 [2024-07-24 18:49:44.099913] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f4a00 00:12:59.172 [2024-07-24 18:49:44.099920] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:59.172 [2024-07-24 18:49:44.100049] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2148790 00:12:59.172 [2024-07-24 18:49:44.100147] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f4a00 00:12:59.172 [2024-07-24 18:49:44.100153] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22f4a00 00:12:59.172 [2024-07-24 18:49:44.100218] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:59.173 
18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.173 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.431 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.431 "name": "raid_bdev1", 00:12:59.431 "uuid": "7797c5ea-3ac0-4a2f-bece-1e6ecdd12e47", 00:12:59.431 "strip_size_kb": 64, 00:12:59.431 "state": "online", 00:12:59.431 "raid_level": "concat", 00:12:59.431 "superblock": true, 00:12:59.431 "num_base_bdevs": 3, 00:12:59.431 "num_base_bdevs_discovered": 3, 00:12:59.431 "num_base_bdevs_operational": 3, 00:12:59.431 "base_bdevs_list": [ 00:12:59.431 { 00:12:59.431 "name": "BaseBdev1", 00:12:59.431 "uuid": "23c96a80-8016-58a1-b58e-84fcdd601ba2", 00:12:59.431 "is_configured": true, 00:12:59.431 "data_offset": 2048, 00:12:59.431 "data_size": 63488 00:12:59.431 }, 00:12:59.431 { 00:12:59.431 "name": "BaseBdev2", 00:12:59.431 "uuid": "397f4cda-5dd7-523f-b2f6-d2a74e6959ab", 00:12:59.431 "is_configured": true, 00:12:59.431 "data_offset": 2048, 00:12:59.431 "data_size": 63488 00:12:59.431 }, 00:12:59.431 { 00:12:59.431 "name": "BaseBdev3", 00:12:59.431 "uuid": "f48d36ba-15e0-53ec-b6b9-5659b8476db6", 00:12:59.431 "is_configured": true, 00:12:59.431 "data_offset": 2048, 00:12:59.431 "data_size": 63488 00:12:59.431 } 00:12:59.431 ] 00:12:59.431 }' 00:12:59.431 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.431 18:49:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.997 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:59.997 18:49:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:59.997 [2024-07-24 18:49:44.848997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22f4930 00:13:00.932 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.193 18:49:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.193 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.193 "name": "raid_bdev1", 00:13:01.193 "uuid": "7797c5ea-3ac0-4a2f-bece-1e6ecdd12e47", 00:13:01.193 "strip_size_kb": 64, 00:13:01.193 "state": "online", 00:13:01.193 "raid_level": "concat", 00:13:01.193 "superblock": true, 00:13:01.193 "num_base_bdevs": 3, 00:13:01.193 "num_base_bdevs_discovered": 3, 00:13:01.193 "num_base_bdevs_operational": 3, 00:13:01.193 "base_bdevs_list": [ 00:13:01.193 { 00:13:01.193 "name": "BaseBdev1", 00:13:01.193 "uuid": "23c96a80-8016-58a1-b58e-84fcdd601ba2", 00:13:01.193 "is_configured": true, 00:13:01.193 "data_offset": 2048, 00:13:01.193 "data_size": 63488 00:13:01.193 }, 00:13:01.193 { 00:13:01.193 "name": "BaseBdev2", 00:13:01.193 "uuid": "397f4cda-5dd7-523f-b2f6-d2a74e6959ab", 00:13:01.193 "is_configured": true, 00:13:01.193 "data_offset": 2048, 00:13:01.193 "data_size": 63488 00:13:01.193 }, 00:13:01.193 { 00:13:01.193 "name": "BaseBdev3", 00:13:01.193 "uuid": "f48d36ba-15e0-53ec-b6b9-5659b8476db6", 00:13:01.193 "is_configured": true, 00:13:01.193 "data_offset": 2048, 00:13:01.193 "data_size": 63488 00:13:01.193 } 00:13:01.193 ] 00:13:01.193 }' 00:13:01.193 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.193 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.759 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:01.759 [2024-07-24 18:49:46.741084] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:01.759 [2024-07-24 18:49:46.741116] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:01.759 [2024-07-24 
18:49:46.743084] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:01.760 [2024-07-24 18:49:46.743109] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:01.760 [2024-07-24 18:49:46.743129] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:01.760 [2024-07-24 18:49:46.743134] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f4a00 name raid_bdev1, state offline 00:13:01.760 0 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2090166 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2090166 ']' 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2090166 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:01.760 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2090166 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2090166' 00:13:02.017 killing process with pid 2090166 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2090166 00:13:02.017 [2024-07-24 18:49:46.802137] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2090166 00:13:02.017 [2024-07-24 18:49:46.820153] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0uuAPHOsbB 00:13:02.017 18:49:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:13:02.017 00:13:02.017 real 0m5.415s 00:13:02.017 user 0m8.394s 00:13:02.017 sys 0m0.765s 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:02.017 18:49:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.017 ************************************ 00:13:02.017 END TEST raid_read_error_test 00:13:02.017 ************************************ 00:13:02.276 18:49:47 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:13:02.276 18:49:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:02.276 18:49:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 
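The read pass that just finished follows a fixed pattern once raid_bdev1 is online: inject a read error on the first base bdev's error device, let bdevperf run its workload, confirm over RPC that the concat array is still online with all three base bdevs (concat has no redundancy, so the test only requires that the reported failure rate is non-zero rather than expecting a degraded array), then delete the array and stop bdevperf. The write-error test starting below repeats the same sequence with the fault injected on the write path. A compact sketch of those control steps, with paths abbreviated relative to the spdk checkout:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
  examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'   # expect "state": "online" with 3 base bdevs discovered
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1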
00:13:02.276 18:49:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:02.276 ************************************ 00:13:02.276 START TEST raid_write_error_test 00:13:02.276 ************************************ 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Lm7ItH8Vyu 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2091181 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2091181 /var/tmp/spdk-raid.sock 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f 
-L bdev_raid 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2091181 ']' 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:02.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:02.276 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.276 [2024-07-24 18:49:47.128554] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:13:02.276 [2024-07-24 18:49:47.128591] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2091181 ] 00:13:02.276 [2024-07-24 18:49:47.191031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:02.276 [2024-07-24 18:49:47.268531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:02.535 [2024-07-24 18:49:47.318981] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.535 [2024-07-24 18:49:47.319004] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.101 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:03.101 18:49:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:03.101 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:03.101 18:49:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:03.101 BaseBdev1_malloc 00:13:03.101 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:03.359 true 00:13:03.359 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:03.617 [2024-07-24 18:49:48.386403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:03.617 [2024-07-24 18:49:48.386434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.617 [2024-07-24 18:49:48.386445] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2158d20 00:13:03.617 [2024-07-24 18:49:48.386451] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.617 [2024-07-24 18:49:48.387643] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.617 [2024-07-24 18:49:48.387663] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:03.617 BaseBdev1 00:13:03.617 18:49:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:03.617 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:03.617 BaseBdev2_malloc 00:13:03.617 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:03.877 true 00:13:03.877 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:03.877 [2024-07-24 18:49:48.871013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:03.877 [2024-07-24 18:49:48.871043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.877 [2024-07-24 18:49:48.871054] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215dd50 00:13:03.877 [2024-07-24 18:49:48.871060] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.877 [2024-07-24 18:49:48.872070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.877 [2024-07-24 18:49:48.872094] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:03.877 BaseBdev2 00:13:03.877 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:03.877 18:49:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:04.135 BaseBdev3_malloc 00:13:04.135 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:04.394 true 00:13:04.394 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:04.394 [2024-07-24 18:49:49.367851] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:04.394 [2024-07-24 18:49:49.367883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:04.394 [2024-07-24 18:49:49.367894] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215cef0 00:13:04.394 [2024-07-24 18:49:49.367900] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.394 [2024-07-24 18:49:49.368961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.394 [2024-07-24 18:49:49.368982] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:04.394 BaseBdev3 00:13:04.394 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:04.652 [2024-07-24 18:49:49.532308] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:04.652 [2024-07-24 
18:49:49.533170] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:04.652 [2024-07-24 18:49:49.533214] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:04.652 [2024-07-24 18:49:49.533346] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2160a00 00:13:04.652 [2024-07-24 18:49:49.533352] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:04.652 [2024-07-24 18:49:49.533488] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb4790 00:13:04.652 [2024-07-24 18:49:49.533592] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2160a00 00:13:04.652 [2024-07-24 18:49:49.533597] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2160a00 00:13:04.652 [2024-07-24 18:49:49.533663] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.652 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:04.911 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.911 "name": "raid_bdev1", 00:13:04.911 "uuid": "c79a3362-f9e9-4701-8058-61653883deaf", 00:13:04.911 "strip_size_kb": 64, 00:13:04.911 "state": "online", 00:13:04.911 "raid_level": "concat", 00:13:04.911 "superblock": true, 00:13:04.911 "num_base_bdevs": 3, 00:13:04.911 "num_base_bdevs_discovered": 3, 00:13:04.911 "num_base_bdevs_operational": 3, 00:13:04.911 "base_bdevs_list": [ 00:13:04.911 { 00:13:04.911 "name": "BaseBdev1", 00:13:04.911 "uuid": "ab083ca7-e1e3-5dda-83af-dd7add998de6", 00:13:04.911 "is_configured": true, 00:13:04.911 "data_offset": 2048, 00:13:04.911 "data_size": 63488 00:13:04.911 }, 00:13:04.911 { 00:13:04.911 "name": "BaseBdev2", 00:13:04.911 "uuid": "b8d35d82-b721-54f9-802d-2b04256e930c", 00:13:04.911 "is_configured": true, 00:13:04.911 "data_offset": 2048, 00:13:04.911 "data_size": 63488 00:13:04.911 }, 00:13:04.911 { 00:13:04.911 "name": "BaseBdev3", 00:13:04.911 "uuid": "e88a2a2b-d455-5992-a8c9-270899d37853", 00:13:04.911 "is_configured": true, 00:13:04.911 "data_offset": 2048, 
00:13:04.911 "data_size": 63488 00:13:04.911 } 00:13:04.911 ] 00:13:04.911 }' 00:13:04.911 18:49:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.911 18:49:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.477 18:49:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:05.477 18:49:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:05.477 [2024-07-24 18:49:50.262428] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2160930 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.414 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:06.673 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.673 "name": "raid_bdev1", 00:13:06.673 "uuid": "c79a3362-f9e9-4701-8058-61653883deaf", 00:13:06.673 "strip_size_kb": 64, 00:13:06.673 "state": "online", 00:13:06.673 "raid_level": "concat", 00:13:06.673 "superblock": true, 00:13:06.673 "num_base_bdevs": 3, 00:13:06.673 "num_base_bdevs_discovered": 3, 00:13:06.673 "num_base_bdevs_operational": 3, 00:13:06.673 "base_bdevs_list": [ 00:13:06.673 { 00:13:06.673 "name": "BaseBdev1", 00:13:06.673 "uuid": "ab083ca7-e1e3-5dda-83af-dd7add998de6", 00:13:06.673 "is_configured": true, 00:13:06.673 "data_offset": 2048, 00:13:06.673 "data_size": 63488 00:13:06.673 }, 00:13:06.673 { 00:13:06.673 "name": "BaseBdev2", 00:13:06.673 "uuid": "b8d35d82-b721-54f9-802d-2b04256e930c", 00:13:06.673 
"is_configured": true, 00:13:06.673 "data_offset": 2048, 00:13:06.673 "data_size": 63488 00:13:06.673 }, 00:13:06.673 { 00:13:06.673 "name": "BaseBdev3", 00:13:06.673 "uuid": "e88a2a2b-d455-5992-a8c9-270899d37853", 00:13:06.673 "is_configured": true, 00:13:06.673 "data_offset": 2048, 00:13:06.673 "data_size": 63488 00:13:06.673 } 00:13:06.673 ] 00:13:06.673 }' 00:13:06.673 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.673 18:49:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.241 18:49:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:07.241 [2024-07-24 18:49:52.146809] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:07.241 [2024-07-24 18:49:52.146836] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:07.241 [2024-07-24 18:49:52.148925] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:07.241 [2024-07-24 18:49:52.148949] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:07.241 [2024-07-24 18:49:52.148969] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:07.241 [2024-07-24 18:49:52.148973] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2160a00 name raid_bdev1, state offline 00:13:07.241 0 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2091181 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2091181 ']' 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2091181 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2091181 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2091181' 00:13:07.241 killing process with pid 2091181 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2091181 00:13:07.241 [2024-07-24 18:49:52.205485] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.241 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2091181 00:13:07.241 [2024-07-24 18:49:52.223532] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Lm7ItH8Vyu 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # 
has_redundancy concat 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:13:07.500 00:13:07.500 real 0m5.341s 00:13:07.500 user 0m8.264s 00:13:07.500 sys 0m0.791s 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:07.500 18:49:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.500 ************************************ 00:13:07.500 END TEST raid_write_error_test 00:13:07.501 ************************************ 00:13:07.501 18:49:52 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:07.501 18:49:52 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:13:07.501 18:49:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:07.501 18:49:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:07.501 18:49:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.501 ************************************ 00:13:07.501 START TEST raid_state_function_test 00:13:07.501 ************************************ 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # 
local strip_size 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2092186 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2092186' 00:13:07.501 Process raid pid: 2092186 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2092186 /var/tmp/spdk-raid.sock 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2092186 ']' 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:07.501 18:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.760 [2024-07-24 18:49:52.534830] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
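Unlike the two I/O error tests, raid_state_function_test launches a bare bdev_svc app (test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid) and exercises raid bdev state transitions purely through RPC, with no bdevperf workload. The first steps visible below create a raid1 array whose base bdevs do not yet exist, so Existed_Raid stays in the "configuring" state with zero discovered members, and then add the base bdevs one at a time, starting with BaseBdev1. A minimal sketch of that opening sequence, using the same rpc.py shorthand:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'   # "state": "configuring", num_base_bdevs_discovered: 0
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1   # first base bdev appears and is claimed by the array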
00:13:07.760 [2024-07-24 18:49:52.534870] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:07.760 [2024-07-24 18:49:52.599892] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.760 [2024-07-24 18:49:52.668580] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.760 [2024-07-24 18:49:52.717812] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.760 [2024-07-24 18:49:52.717835] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.327 18:49:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:08.327 18:49:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:08.327 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:08.585 [2024-07-24 18:49:53.472912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:08.585 [2024-07-24 18:49:53.472941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:08.585 [2024-07-24 18:49:53.472947] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:08.585 [2024-07-24 18:49:53.472952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:08.585 [2024-07-24 18:49:53.472958] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:08.585 [2024-07-24 18:49:53.472963] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:08.585 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.586 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.844 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:08.844 "name": "Existed_Raid", 00:13:08.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.844 "strip_size_kb": 0, 00:13:08.844 "state": "configuring", 00:13:08.844 "raid_level": "raid1", 00:13:08.844 "superblock": false, 00:13:08.844 "num_base_bdevs": 3, 00:13:08.844 "num_base_bdevs_discovered": 0, 00:13:08.844 "num_base_bdevs_operational": 3, 00:13:08.844 "base_bdevs_list": [ 00:13:08.844 { 00:13:08.844 "name": "BaseBdev1", 00:13:08.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.844 "is_configured": false, 00:13:08.844 "data_offset": 0, 00:13:08.844 "data_size": 0 00:13:08.844 }, 00:13:08.844 { 00:13:08.844 "name": "BaseBdev2", 00:13:08.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.844 "is_configured": false, 00:13:08.844 "data_offset": 0, 00:13:08.844 "data_size": 0 00:13:08.844 }, 00:13:08.844 { 00:13:08.844 "name": "BaseBdev3", 00:13:08.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.844 "is_configured": false, 00:13:08.844 "data_offset": 0, 00:13:08.844 "data_size": 0 00:13:08.844 } 00:13:08.844 ] 00:13:08.844 }' 00:13:08.844 18:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.844 18:49:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.409 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.409 [2024-07-24 18:49:54.282916] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.409 [2024-07-24 18:49:54.282936] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3fba0 name Existed_Raid, state configuring 00:13:09.409 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:09.668 [2024-07-24 18:49:54.463390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.668 [2024-07-24 18:49:54.463408] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.668 [2024-07-24 18:49:54.463412] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.668 [2024-07-24 18:49:54.463417] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.668 [2024-07-24 18:49:54.463421] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:09.668 [2024-07-24 18:49:54.463426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:09.668 [2024-07-24 18:49:54.635993] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:09.668 BaseBdev1 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:09.668 
18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:09.668 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:09.925 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.182 [ 00:13:10.183 { 00:13:10.183 "name": "BaseBdev1", 00:13:10.183 "aliases": [ 00:13:10.183 "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0" 00:13:10.183 ], 00:13:10.183 "product_name": "Malloc disk", 00:13:10.183 "block_size": 512, 00:13:10.183 "num_blocks": 65536, 00:13:10.183 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:10.183 "assigned_rate_limits": { 00:13:10.183 "rw_ios_per_sec": 0, 00:13:10.183 "rw_mbytes_per_sec": 0, 00:13:10.183 "r_mbytes_per_sec": 0, 00:13:10.183 "w_mbytes_per_sec": 0 00:13:10.183 }, 00:13:10.183 "claimed": true, 00:13:10.183 "claim_type": "exclusive_write", 00:13:10.183 "zoned": false, 00:13:10.183 "supported_io_types": { 00:13:10.183 "read": true, 00:13:10.183 "write": true, 00:13:10.183 "unmap": true, 00:13:10.183 "flush": true, 00:13:10.183 "reset": true, 00:13:10.183 "nvme_admin": false, 00:13:10.183 "nvme_io": false, 00:13:10.183 "nvme_io_md": false, 00:13:10.183 "write_zeroes": true, 00:13:10.183 "zcopy": true, 00:13:10.183 "get_zone_info": false, 00:13:10.183 "zone_management": false, 00:13:10.183 "zone_append": false, 00:13:10.183 "compare": false, 00:13:10.183 "compare_and_write": false, 00:13:10.183 "abort": true, 00:13:10.183 "seek_hole": false, 00:13:10.183 "seek_data": false, 00:13:10.183 "copy": true, 00:13:10.183 "nvme_iov_md": false 00:13:10.183 }, 00:13:10.183 "memory_domains": [ 00:13:10.183 { 00:13:10.183 "dma_device_id": "system", 00:13:10.183 "dma_device_type": 1 00:13:10.183 }, 00:13:10.183 { 00:13:10.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.183 "dma_device_type": 2 00:13:10.183 } 00:13:10.183 ], 00:13:10.183 "driver_specific": {} 00:13:10.183 } 00:13:10.183 ] 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.183 18:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.183 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.183 "name": "Existed_Raid", 00:13:10.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.183 "strip_size_kb": 0, 00:13:10.183 "state": "configuring", 00:13:10.183 "raid_level": "raid1", 00:13:10.183 "superblock": false, 00:13:10.183 "num_base_bdevs": 3, 00:13:10.183 "num_base_bdevs_discovered": 1, 00:13:10.183 "num_base_bdevs_operational": 3, 00:13:10.183 "base_bdevs_list": [ 00:13:10.183 { 00:13:10.183 "name": "BaseBdev1", 00:13:10.183 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:10.183 "is_configured": true, 00:13:10.183 "data_offset": 0, 00:13:10.183 "data_size": 65536 00:13:10.183 }, 00:13:10.183 { 00:13:10.183 "name": "BaseBdev2", 00:13:10.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.183 "is_configured": false, 00:13:10.183 "data_offset": 0, 00:13:10.183 "data_size": 0 00:13:10.183 }, 00:13:10.183 { 00:13:10.183 "name": "BaseBdev3", 00:13:10.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.183 "is_configured": false, 00:13:10.183 "data_offset": 0, 00:13:10.183 "data_size": 0 00:13:10.183 } 00:13:10.183 ] 00:13:10.183 }' 00:13:10.183 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.183 18:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.749 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.007 [2024-07-24 18:49:55.799000] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.007 [2024-07-24 18:49:55.799035] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3f470 name Existed_Raid, state configuring 00:13:11.007 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:11.007 [2024-07-24 18:49:55.967449] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.007 [2024-07-24 18:49:55.968509] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.007 [2024-07-24 18:49:55.968535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.007 [2024-07-24 18:49:55.968541] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:11.007 [2024-07-24 18:49:55.968546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:11.007 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.007 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.007 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.008 18:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.264 18:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.264 "name": "Existed_Raid", 00:13:11.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.265 "strip_size_kb": 0, 00:13:11.265 "state": "configuring", 00:13:11.265 "raid_level": "raid1", 00:13:11.265 "superblock": false, 00:13:11.265 "num_base_bdevs": 3, 00:13:11.265 "num_base_bdevs_discovered": 1, 00:13:11.265 "num_base_bdevs_operational": 3, 00:13:11.265 "base_bdevs_list": [ 00:13:11.265 { 00:13:11.265 "name": "BaseBdev1", 00:13:11.265 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:11.265 "is_configured": true, 00:13:11.265 "data_offset": 0, 00:13:11.265 "data_size": 65536 00:13:11.265 }, 00:13:11.265 { 00:13:11.265 "name": "BaseBdev2", 00:13:11.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.265 "is_configured": false, 00:13:11.265 "data_offset": 0, 00:13:11.265 "data_size": 0 00:13:11.265 }, 00:13:11.265 { 00:13:11.265 "name": "BaseBdev3", 00:13:11.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.265 "is_configured": false, 00:13:11.265 "data_offset": 0, 00:13:11.265 "data_size": 0 00:13:11.265 } 00:13:11.265 ] 00:13:11.265 }' 00:13:11.265 18:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.265 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:11.831 [2024-07-24 18:49:56.796295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:11.831 BaseBdev2 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:11.831 18:49:56 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:11.831 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.088 18:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:12.347 [ 00:13:12.347 { 00:13:12.347 "name": "BaseBdev2", 00:13:12.347 "aliases": [ 00:13:12.347 "60002530-85a5-411c-a8fd-45005c4de68a" 00:13:12.347 ], 00:13:12.347 "product_name": "Malloc disk", 00:13:12.347 "block_size": 512, 00:13:12.347 "num_blocks": 65536, 00:13:12.347 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:12.347 "assigned_rate_limits": { 00:13:12.347 "rw_ios_per_sec": 0, 00:13:12.347 "rw_mbytes_per_sec": 0, 00:13:12.347 "r_mbytes_per_sec": 0, 00:13:12.347 "w_mbytes_per_sec": 0 00:13:12.347 }, 00:13:12.347 "claimed": true, 00:13:12.347 "claim_type": "exclusive_write", 00:13:12.347 "zoned": false, 00:13:12.347 "supported_io_types": { 00:13:12.347 "read": true, 00:13:12.347 "write": true, 00:13:12.347 "unmap": true, 00:13:12.347 "flush": true, 00:13:12.347 "reset": true, 00:13:12.347 "nvme_admin": false, 00:13:12.347 "nvme_io": false, 00:13:12.347 "nvme_io_md": false, 00:13:12.347 "write_zeroes": true, 00:13:12.347 "zcopy": true, 00:13:12.347 "get_zone_info": false, 00:13:12.347 "zone_management": false, 00:13:12.347 "zone_append": false, 00:13:12.347 "compare": false, 00:13:12.347 "compare_and_write": false, 00:13:12.347 "abort": true, 00:13:12.347 "seek_hole": false, 00:13:12.347 "seek_data": false, 00:13:12.347 "copy": true, 00:13:12.347 "nvme_iov_md": false 00:13:12.347 }, 00:13:12.347 "memory_domains": [ 00:13:12.347 { 00:13:12.347 "dma_device_id": "system", 00:13:12.347 "dma_device_type": 1 00:13:12.347 }, 00:13:12.347 { 00:13:12.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.347 "dma_device_type": 2 00:13:12.347 } 00:13:12.347 ], 00:13:12.347 "driver_specific": {} 00:13:12.347 } 00:13:12.347 ] 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.347 
18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.347 "name": "Existed_Raid", 00:13:12.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.347 "strip_size_kb": 0, 00:13:12.347 "state": "configuring", 00:13:12.347 "raid_level": "raid1", 00:13:12.347 "superblock": false, 00:13:12.347 "num_base_bdevs": 3, 00:13:12.347 "num_base_bdevs_discovered": 2, 00:13:12.347 "num_base_bdevs_operational": 3, 00:13:12.347 "base_bdevs_list": [ 00:13:12.347 { 00:13:12.347 "name": "BaseBdev1", 00:13:12.347 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:12.347 "is_configured": true, 00:13:12.347 "data_offset": 0, 00:13:12.347 "data_size": 65536 00:13:12.347 }, 00:13:12.347 { 00:13:12.347 "name": "BaseBdev2", 00:13:12.347 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:12.347 "is_configured": true, 00:13:12.347 "data_offset": 0, 00:13:12.347 "data_size": 65536 00:13:12.347 }, 00:13:12.347 { 00:13:12.347 "name": "BaseBdev3", 00:13:12.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.347 "is_configured": false, 00:13:12.347 "data_offset": 0, 00:13:12.347 "data_size": 0 00:13:12.347 } 00:13:12.347 ] 00:13:12.347 }' 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.347 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.914 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:13.171 [2024-07-24 18:49:57.953900] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:13.171 [2024-07-24 18:49:57.953932] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf40360 00:13:13.172 [2024-07-24 18:49:57.953936] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:13.172 [2024-07-24 18:49:57.954067] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10e8860 00:13:13.172 [2024-07-24 18:49:57.954155] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf40360 00:13:13.172 [2024-07-24 18:49:57.954164] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf40360 00:13:13.172 [2024-07-24 18:49:57.954282] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:13.172 BaseBdev3 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:13.172 18:49:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:13.172 18:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.172 18:49:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:13.430 [ 00:13:13.430 { 00:13:13.430 "name": "BaseBdev3", 00:13:13.430 "aliases": [ 00:13:13.430 "df2de7df-b93f-4ee5-9885-32c18a1d375a" 00:13:13.430 ], 00:13:13.430 "product_name": "Malloc disk", 00:13:13.430 "block_size": 512, 00:13:13.430 "num_blocks": 65536, 00:13:13.430 "uuid": "df2de7df-b93f-4ee5-9885-32c18a1d375a", 00:13:13.430 "assigned_rate_limits": { 00:13:13.430 "rw_ios_per_sec": 0, 00:13:13.430 "rw_mbytes_per_sec": 0, 00:13:13.430 "r_mbytes_per_sec": 0, 00:13:13.430 "w_mbytes_per_sec": 0 00:13:13.430 }, 00:13:13.430 "claimed": true, 00:13:13.430 "claim_type": "exclusive_write", 00:13:13.430 "zoned": false, 00:13:13.430 "supported_io_types": { 00:13:13.430 "read": true, 00:13:13.430 "write": true, 00:13:13.430 "unmap": true, 00:13:13.430 "flush": true, 00:13:13.430 "reset": true, 00:13:13.430 "nvme_admin": false, 00:13:13.430 "nvme_io": false, 00:13:13.430 "nvme_io_md": false, 00:13:13.430 "write_zeroes": true, 00:13:13.430 "zcopy": true, 00:13:13.430 "get_zone_info": false, 00:13:13.430 "zone_management": false, 00:13:13.430 "zone_append": false, 00:13:13.430 "compare": false, 00:13:13.430 "compare_and_write": false, 00:13:13.430 "abort": true, 00:13:13.430 "seek_hole": false, 00:13:13.430 "seek_data": false, 00:13:13.430 "copy": true, 00:13:13.430 "nvme_iov_md": false 00:13:13.430 }, 00:13:13.430 "memory_domains": [ 00:13:13.430 { 00:13:13.430 "dma_device_id": "system", 00:13:13.430 "dma_device_type": 1 00:13:13.430 }, 00:13:13.430 { 00:13:13.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.430 "dma_device_type": 2 00:13:13.430 } 00:13:13.430 ], 00:13:13.430 "driver_specific": {} 00:13:13.430 } 00:13:13.430 ] 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.430 18:49:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.430 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.689 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.689 "name": "Existed_Raid", 00:13:13.689 "uuid": "d3102d9f-ba63-437d-96c8-8cbfd231a052", 00:13:13.689 "strip_size_kb": 0, 00:13:13.689 "state": "online", 00:13:13.689 "raid_level": "raid1", 00:13:13.689 "superblock": false, 00:13:13.689 "num_base_bdevs": 3, 00:13:13.689 "num_base_bdevs_discovered": 3, 00:13:13.689 "num_base_bdevs_operational": 3, 00:13:13.689 "base_bdevs_list": [ 00:13:13.689 { 00:13:13.689 "name": "BaseBdev1", 00:13:13.689 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:13.689 "is_configured": true, 00:13:13.689 "data_offset": 0, 00:13:13.689 "data_size": 65536 00:13:13.689 }, 00:13:13.689 { 00:13:13.689 "name": "BaseBdev2", 00:13:13.689 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:13.689 "is_configured": true, 00:13:13.689 "data_offset": 0, 00:13:13.689 "data_size": 65536 00:13:13.689 }, 00:13:13.689 { 00:13:13.689 "name": "BaseBdev3", 00:13:13.689 "uuid": "df2de7df-b93f-4ee5-9885-32c18a1d375a", 00:13:13.689 "is_configured": true, 00:13:13.689 "data_offset": 0, 00:13:13.689 "data_size": 65536 00:13:13.689 } 00:13:13.689 ] 00:13:13.689 }' 00:13:13.689 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.689 18:49:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:13.948 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.207 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.207 18:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.207 [2024-07-24 18:49:59.109076] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.207 "name": "Existed_Raid", 00:13:14.207 "aliases": [ 00:13:14.207 "d3102d9f-ba63-437d-96c8-8cbfd231a052" 00:13:14.207 ], 00:13:14.207 "product_name": "Raid Volume", 00:13:14.207 "block_size": 512, 00:13:14.207 "num_blocks": 65536, 00:13:14.207 "uuid": "d3102d9f-ba63-437d-96c8-8cbfd231a052", 
00:13:14.207 "assigned_rate_limits": { 00:13:14.207 "rw_ios_per_sec": 0, 00:13:14.207 "rw_mbytes_per_sec": 0, 00:13:14.207 "r_mbytes_per_sec": 0, 00:13:14.207 "w_mbytes_per_sec": 0 00:13:14.207 }, 00:13:14.207 "claimed": false, 00:13:14.207 "zoned": false, 00:13:14.207 "supported_io_types": { 00:13:14.207 "read": true, 00:13:14.207 "write": true, 00:13:14.207 "unmap": false, 00:13:14.207 "flush": false, 00:13:14.207 "reset": true, 00:13:14.207 "nvme_admin": false, 00:13:14.207 "nvme_io": false, 00:13:14.207 "nvme_io_md": false, 00:13:14.207 "write_zeroes": true, 00:13:14.207 "zcopy": false, 00:13:14.207 "get_zone_info": false, 00:13:14.207 "zone_management": false, 00:13:14.207 "zone_append": false, 00:13:14.207 "compare": false, 00:13:14.207 "compare_and_write": false, 00:13:14.207 "abort": false, 00:13:14.207 "seek_hole": false, 00:13:14.207 "seek_data": false, 00:13:14.207 "copy": false, 00:13:14.207 "nvme_iov_md": false 00:13:14.207 }, 00:13:14.207 "memory_domains": [ 00:13:14.207 { 00:13:14.207 "dma_device_id": "system", 00:13:14.207 "dma_device_type": 1 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.207 "dma_device_type": 2 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "dma_device_id": "system", 00:13:14.207 "dma_device_type": 1 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.207 "dma_device_type": 2 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "dma_device_id": "system", 00:13:14.207 "dma_device_type": 1 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.207 "dma_device_type": 2 00:13:14.207 } 00:13:14.207 ], 00:13:14.207 "driver_specific": { 00:13:14.207 "raid": { 00:13:14.207 "uuid": "d3102d9f-ba63-437d-96c8-8cbfd231a052", 00:13:14.207 "strip_size_kb": 0, 00:13:14.207 "state": "online", 00:13:14.207 "raid_level": "raid1", 00:13:14.207 "superblock": false, 00:13:14.207 "num_base_bdevs": 3, 00:13:14.207 "num_base_bdevs_discovered": 3, 00:13:14.207 "num_base_bdevs_operational": 3, 00:13:14.207 "base_bdevs_list": [ 00:13:14.207 { 00:13:14.207 "name": "BaseBdev1", 00:13:14.207 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:14.207 "is_configured": true, 00:13:14.207 "data_offset": 0, 00:13:14.207 "data_size": 65536 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "name": "BaseBdev2", 00:13:14.207 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:14.207 "is_configured": true, 00:13:14.207 "data_offset": 0, 00:13:14.207 "data_size": 65536 00:13:14.207 }, 00:13:14.207 { 00:13:14.207 "name": "BaseBdev3", 00:13:14.207 "uuid": "df2de7df-b93f-4ee5-9885-32c18a1d375a", 00:13:14.207 "is_configured": true, 00:13:14.207 "data_offset": 0, 00:13:14.207 "data_size": 65536 00:13:14.207 } 00:13:14.207 ] 00:13:14.207 } 00:13:14.207 } 00:13:14.207 }' 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:14.207 BaseBdev2 00:13:14.207 BaseBdev3' 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:14.207 18:49:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.466 "name": "BaseBdev1", 00:13:14.466 "aliases": [ 00:13:14.466 "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0" 00:13:14.466 ], 00:13:14.466 "product_name": "Malloc disk", 00:13:14.466 "block_size": 512, 00:13:14.466 "num_blocks": 65536, 00:13:14.466 "uuid": "0e2838fb-68c7-44c6-9a81-7ce33f2d25c0", 00:13:14.466 "assigned_rate_limits": { 00:13:14.466 "rw_ios_per_sec": 0, 00:13:14.466 "rw_mbytes_per_sec": 0, 00:13:14.466 "r_mbytes_per_sec": 0, 00:13:14.466 "w_mbytes_per_sec": 0 00:13:14.466 }, 00:13:14.466 "claimed": true, 00:13:14.466 "claim_type": "exclusive_write", 00:13:14.466 "zoned": false, 00:13:14.466 "supported_io_types": { 00:13:14.466 "read": true, 00:13:14.466 "write": true, 00:13:14.466 "unmap": true, 00:13:14.466 "flush": true, 00:13:14.466 "reset": true, 00:13:14.466 "nvme_admin": false, 00:13:14.466 "nvme_io": false, 00:13:14.466 "nvme_io_md": false, 00:13:14.466 "write_zeroes": true, 00:13:14.466 "zcopy": true, 00:13:14.466 "get_zone_info": false, 00:13:14.466 "zone_management": false, 00:13:14.466 "zone_append": false, 00:13:14.466 "compare": false, 00:13:14.466 "compare_and_write": false, 00:13:14.466 "abort": true, 00:13:14.466 "seek_hole": false, 00:13:14.466 "seek_data": false, 00:13:14.466 "copy": true, 00:13:14.466 "nvme_iov_md": false 00:13:14.466 }, 00:13:14.466 "memory_domains": [ 00:13:14.466 { 00:13:14.466 "dma_device_id": "system", 00:13:14.466 "dma_device_type": 1 00:13:14.466 }, 00:13:14.466 { 00:13:14.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.466 "dma_device_type": 2 00:13:14.466 } 00:13:14.466 ], 00:13:14.466 "driver_specific": {} 00:13:14.466 }' 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.466 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.725 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.012 "name": "BaseBdev2", 
00:13:15.012 "aliases": [ 00:13:15.012 "60002530-85a5-411c-a8fd-45005c4de68a" 00:13:15.012 ], 00:13:15.012 "product_name": "Malloc disk", 00:13:15.012 "block_size": 512, 00:13:15.012 "num_blocks": 65536, 00:13:15.012 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:15.012 "assigned_rate_limits": { 00:13:15.012 "rw_ios_per_sec": 0, 00:13:15.012 "rw_mbytes_per_sec": 0, 00:13:15.012 "r_mbytes_per_sec": 0, 00:13:15.012 "w_mbytes_per_sec": 0 00:13:15.012 }, 00:13:15.012 "claimed": true, 00:13:15.012 "claim_type": "exclusive_write", 00:13:15.012 "zoned": false, 00:13:15.012 "supported_io_types": { 00:13:15.012 "read": true, 00:13:15.012 "write": true, 00:13:15.012 "unmap": true, 00:13:15.012 "flush": true, 00:13:15.012 "reset": true, 00:13:15.012 "nvme_admin": false, 00:13:15.012 "nvme_io": false, 00:13:15.012 "nvme_io_md": false, 00:13:15.012 "write_zeroes": true, 00:13:15.012 "zcopy": true, 00:13:15.012 "get_zone_info": false, 00:13:15.012 "zone_management": false, 00:13:15.012 "zone_append": false, 00:13:15.012 "compare": false, 00:13:15.012 "compare_and_write": false, 00:13:15.012 "abort": true, 00:13:15.012 "seek_hole": false, 00:13:15.012 "seek_data": false, 00:13:15.012 "copy": true, 00:13:15.012 "nvme_iov_md": false 00:13:15.012 }, 00:13:15.012 "memory_domains": [ 00:13:15.012 { 00:13:15.012 "dma_device_id": "system", 00:13:15.012 "dma_device_type": 1 00:13:15.012 }, 00:13:15.012 { 00:13:15.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.012 "dma_device_type": 2 00:13:15.012 } 00:13:15.012 ], 00:13:15.012 "driver_specific": {} 00:13:15.012 }' 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.012 18:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:15.280 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.539 "name": "BaseBdev3", 00:13:15.539 "aliases": [ 00:13:15.539 "df2de7df-b93f-4ee5-9885-32c18a1d375a" 00:13:15.539 ], 00:13:15.539 "product_name": "Malloc disk", 00:13:15.539 "block_size": 512, 
00:13:15.539 "num_blocks": 65536, 00:13:15.539 "uuid": "df2de7df-b93f-4ee5-9885-32c18a1d375a", 00:13:15.539 "assigned_rate_limits": { 00:13:15.539 "rw_ios_per_sec": 0, 00:13:15.539 "rw_mbytes_per_sec": 0, 00:13:15.539 "r_mbytes_per_sec": 0, 00:13:15.539 "w_mbytes_per_sec": 0 00:13:15.539 }, 00:13:15.539 "claimed": true, 00:13:15.539 "claim_type": "exclusive_write", 00:13:15.539 "zoned": false, 00:13:15.539 "supported_io_types": { 00:13:15.539 "read": true, 00:13:15.539 "write": true, 00:13:15.539 "unmap": true, 00:13:15.539 "flush": true, 00:13:15.539 "reset": true, 00:13:15.539 "nvme_admin": false, 00:13:15.539 "nvme_io": false, 00:13:15.539 "nvme_io_md": false, 00:13:15.539 "write_zeroes": true, 00:13:15.539 "zcopy": true, 00:13:15.539 "get_zone_info": false, 00:13:15.539 "zone_management": false, 00:13:15.539 "zone_append": false, 00:13:15.539 "compare": false, 00:13:15.539 "compare_and_write": false, 00:13:15.539 "abort": true, 00:13:15.539 "seek_hole": false, 00:13:15.539 "seek_data": false, 00:13:15.539 "copy": true, 00:13:15.539 "nvme_iov_md": false 00:13:15.539 }, 00:13:15.539 "memory_domains": [ 00:13:15.539 { 00:13:15.539 "dma_device_id": "system", 00:13:15.539 "dma_device_type": 1 00:13:15.539 }, 00:13:15.539 { 00:13:15.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.539 "dma_device_type": 2 00:13:15.539 } 00:13:15.539 ], 00:13:15.539 "driver_specific": {} 00:13:15.539 }' 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.539 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:15.797 [2024-07-24 18:50:00.773223] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:15.797 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.056 "name": "Existed_Raid", 00:13:16.056 "uuid": "d3102d9f-ba63-437d-96c8-8cbfd231a052", 00:13:16.056 "strip_size_kb": 0, 00:13:16.056 "state": "online", 00:13:16.056 "raid_level": "raid1", 00:13:16.056 "superblock": false, 00:13:16.056 "num_base_bdevs": 3, 00:13:16.056 "num_base_bdevs_discovered": 2, 00:13:16.056 "num_base_bdevs_operational": 2, 00:13:16.056 "base_bdevs_list": [ 00:13:16.056 { 00:13:16.056 "name": null, 00:13:16.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.056 "is_configured": false, 00:13:16.056 "data_offset": 0, 00:13:16.056 "data_size": 65536 00:13:16.056 }, 00:13:16.056 { 00:13:16.056 "name": "BaseBdev2", 00:13:16.056 "uuid": "60002530-85a5-411c-a8fd-45005c4de68a", 00:13:16.056 "is_configured": true, 00:13:16.056 "data_offset": 0, 00:13:16.056 "data_size": 65536 00:13:16.056 }, 00:13:16.056 { 00:13:16.056 "name": "BaseBdev3", 00:13:16.056 "uuid": "df2de7df-b93f-4ee5-9885-32c18a1d375a", 00:13:16.056 "is_configured": true, 00:13:16.056 "data_offset": 0, 00:13:16.056 "data_size": 65536 00:13:16.056 } 00:13:16.056 ] 00:13:16.056 }' 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.056 18:50:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.622 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:16.622 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.622 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:16.622 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.880 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:16.880 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:13:16.880 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:16.880 [2024-07-24 18:50:01.856785] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:16.880 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:16.880 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.139 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:17.139 18:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.139 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:17.139 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:17.139 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:17.397 [2024-07-24 18:50:02.203626] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:17.397 [2024-07-24 18:50:02.203682] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:17.397 [2024-07-24 18:50:02.221667] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:17.397 [2024-07-24 18:50:02.221691] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:17.397 [2024-07-24 18:50:02.221697] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf40360 name Existed_Raid, state offline 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:17.397 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:17.656 BaseBdev2 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:17.656 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:17.915 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:17.915 [ 00:13:17.915 { 00:13:17.915 "name": "BaseBdev2", 00:13:17.915 "aliases": [ 00:13:17.915 "2a6431d8-ce4c-441e-9995-d7f3182141f1" 00:13:17.915 ], 00:13:17.915 "product_name": "Malloc disk", 00:13:17.915 "block_size": 512, 00:13:17.915 "num_blocks": 65536, 00:13:17.915 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:17.915 "assigned_rate_limits": { 00:13:17.915 "rw_ios_per_sec": 0, 00:13:17.915 "rw_mbytes_per_sec": 0, 00:13:17.915 "r_mbytes_per_sec": 0, 00:13:17.915 "w_mbytes_per_sec": 0 00:13:17.915 }, 00:13:17.915 "claimed": false, 00:13:17.915 "zoned": false, 00:13:17.915 "supported_io_types": { 00:13:17.915 "read": true, 00:13:17.915 "write": true, 00:13:17.915 "unmap": true, 00:13:17.915 "flush": true, 00:13:17.915 "reset": true, 00:13:17.915 "nvme_admin": false, 00:13:17.915 "nvme_io": false, 00:13:17.915 "nvme_io_md": false, 00:13:17.915 "write_zeroes": true, 00:13:17.915 "zcopy": true, 00:13:17.915 "get_zone_info": false, 00:13:17.915 "zone_management": false, 00:13:17.915 "zone_append": false, 00:13:17.915 "compare": false, 00:13:17.915 "compare_and_write": false, 00:13:17.915 "abort": true, 00:13:17.915 "seek_hole": false, 00:13:17.915 "seek_data": false, 00:13:17.915 "copy": true, 00:13:17.915 "nvme_iov_md": false 00:13:17.915 }, 00:13:17.915 "memory_domains": [ 00:13:17.915 { 00:13:17.915 "dma_device_id": "system", 00:13:17.915 "dma_device_type": 1 00:13:17.915 }, 00:13:17.915 { 00:13:17.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.915 "dma_device_type": 2 00:13:17.915 } 00:13:17.915 ], 00:13:17.915 "driver_specific": {} 00:13:17.915 } 00:13:17.915 ] 00:13:17.915 18:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:17.915 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:17.915 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:17.915 18:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:18.173 BaseBdev3 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:18.173 18:50:03 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:18.173 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:18.440 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:18.440 [ 00:13:18.440 { 00:13:18.440 "name": "BaseBdev3", 00:13:18.440 "aliases": [ 00:13:18.440 "ee7e9a24-28e0-4483-8957-0138f513d97e" 00:13:18.440 ], 00:13:18.440 "product_name": "Malloc disk", 00:13:18.440 "block_size": 512, 00:13:18.440 "num_blocks": 65536, 00:13:18.440 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:18.440 "assigned_rate_limits": { 00:13:18.440 "rw_ios_per_sec": 0, 00:13:18.440 "rw_mbytes_per_sec": 0, 00:13:18.440 "r_mbytes_per_sec": 0, 00:13:18.440 "w_mbytes_per_sec": 0 00:13:18.440 }, 00:13:18.440 "claimed": false, 00:13:18.440 "zoned": false, 00:13:18.440 "supported_io_types": { 00:13:18.440 "read": true, 00:13:18.440 "write": true, 00:13:18.440 "unmap": true, 00:13:18.440 "flush": true, 00:13:18.440 "reset": true, 00:13:18.440 "nvme_admin": false, 00:13:18.440 "nvme_io": false, 00:13:18.440 "nvme_io_md": false, 00:13:18.440 "write_zeroes": true, 00:13:18.440 "zcopy": true, 00:13:18.440 "get_zone_info": false, 00:13:18.440 "zone_management": false, 00:13:18.440 "zone_append": false, 00:13:18.440 "compare": false, 00:13:18.440 "compare_and_write": false, 00:13:18.440 "abort": true, 00:13:18.440 "seek_hole": false, 00:13:18.440 "seek_data": false, 00:13:18.440 "copy": true, 00:13:18.440 "nvme_iov_md": false 00:13:18.440 }, 00:13:18.440 "memory_domains": [ 00:13:18.440 { 00:13:18.440 "dma_device_id": "system", 00:13:18.440 "dma_device_type": 1 00:13:18.440 }, 00:13:18.440 { 00:13:18.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.440 "dma_device_type": 2 00:13:18.440 } 00:13:18.440 ], 00:13:18.440 "driver_specific": {} 00:13:18.440 } 00:13:18.440 ] 00:13:18.440 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:18.440 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:18.440 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:18.440 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:18.700 [2024-07-24 18:50:03.569898] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:18.700 [2024-07-24 18:50:03.569925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:18.700 [2024-07-24 18:50:03.569937] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:18.700 [2024-07-24 18:50:03.571052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.700 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.701 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.958 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.958 "name": "Existed_Raid", 00:13:18.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.958 "strip_size_kb": 0, 00:13:18.958 "state": "configuring", 00:13:18.958 "raid_level": "raid1", 00:13:18.958 "superblock": false, 00:13:18.958 "num_base_bdevs": 3, 00:13:18.958 "num_base_bdevs_discovered": 2, 00:13:18.958 "num_base_bdevs_operational": 3, 00:13:18.958 "base_bdevs_list": [ 00:13:18.958 { 00:13:18.958 "name": "BaseBdev1", 00:13:18.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.958 "is_configured": false, 00:13:18.958 "data_offset": 0, 00:13:18.958 "data_size": 0 00:13:18.958 }, 00:13:18.958 { 00:13:18.958 "name": "BaseBdev2", 00:13:18.958 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:18.958 "is_configured": true, 00:13:18.958 "data_offset": 0, 00:13:18.958 "data_size": 65536 00:13:18.958 }, 00:13:18.958 { 00:13:18.958 "name": "BaseBdev3", 00:13:18.958 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:18.958 "is_configured": true, 00:13:18.958 "data_offset": 0, 00:13:18.958 "data_size": 65536 00:13:18.958 } 00:13:18.958 ] 00:13:18.958 }' 00:13:18.958 18:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.959 18:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:19.525 [2024-07-24 18:50:04.379978] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.525 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.784 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.784 "name": "Existed_Raid", 00:13:19.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.784 "strip_size_kb": 0, 00:13:19.784 "state": "configuring", 00:13:19.784 "raid_level": "raid1", 00:13:19.784 "superblock": false, 00:13:19.784 "num_base_bdevs": 3, 00:13:19.784 "num_base_bdevs_discovered": 1, 00:13:19.784 "num_base_bdevs_operational": 3, 00:13:19.784 "base_bdevs_list": [ 00:13:19.784 { 00:13:19.784 "name": "BaseBdev1", 00:13:19.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.784 "is_configured": false, 00:13:19.784 "data_offset": 0, 00:13:19.784 "data_size": 0 00:13:19.784 }, 00:13:19.784 { 00:13:19.784 "name": null, 00:13:19.784 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:19.784 "is_configured": false, 00:13:19.784 "data_offset": 0, 00:13:19.784 "data_size": 65536 00:13:19.784 }, 00:13:19.784 { 00:13:19.784 "name": "BaseBdev3", 00:13:19.784 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:19.784 "is_configured": true, 00:13:19.784 "data_offset": 0, 00:13:19.784 "data_size": 65536 00:13:19.784 } 00:13:19.784 ] 00:13:19.784 }' 00:13:19.784 18:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.784 18:50:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.041 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.041 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:20.299 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:20.299 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:20.557 [2024-07-24 18:50:05.386463] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:20.557 BaseBdev1 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.557 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:20.816 [ 00:13:20.816 { 00:13:20.816 "name": "BaseBdev1", 00:13:20.816 "aliases": [ 00:13:20.816 "87382671-618d-4023-933a-d374f030ec48" 00:13:20.816 ], 00:13:20.816 "product_name": "Malloc disk", 00:13:20.816 "block_size": 512, 00:13:20.816 "num_blocks": 65536, 00:13:20.816 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:20.816 "assigned_rate_limits": { 00:13:20.816 "rw_ios_per_sec": 0, 00:13:20.816 "rw_mbytes_per_sec": 0, 00:13:20.816 "r_mbytes_per_sec": 0, 00:13:20.816 "w_mbytes_per_sec": 0 00:13:20.816 }, 00:13:20.816 "claimed": true, 00:13:20.816 "claim_type": "exclusive_write", 00:13:20.816 "zoned": false, 00:13:20.816 "supported_io_types": { 00:13:20.816 "read": true, 00:13:20.816 "write": true, 00:13:20.816 "unmap": true, 00:13:20.816 "flush": true, 00:13:20.816 "reset": true, 00:13:20.816 "nvme_admin": false, 00:13:20.816 "nvme_io": false, 00:13:20.816 "nvme_io_md": false, 00:13:20.816 "write_zeroes": true, 00:13:20.816 "zcopy": true, 00:13:20.816 "get_zone_info": false, 00:13:20.816 "zone_management": false, 00:13:20.816 "zone_append": false, 00:13:20.816 "compare": false, 00:13:20.816 "compare_and_write": false, 00:13:20.816 "abort": true, 00:13:20.816 "seek_hole": false, 00:13:20.816 "seek_data": false, 00:13:20.816 "copy": true, 00:13:20.816 "nvme_iov_md": false 00:13:20.816 }, 00:13:20.816 "memory_domains": [ 00:13:20.816 { 00:13:20.816 "dma_device_id": "system", 00:13:20.816 "dma_device_type": 1 00:13:20.816 }, 00:13:20.816 { 00:13:20.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.816 "dma_device_type": 2 00:13:20.816 } 00:13:20.816 ], 00:13:20.816 "driver_specific": {} 00:13:20.816 } 00:13:20.816 ] 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.816 18:50:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.816 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.075 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.075 "name": "Existed_Raid", 00:13:21.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.075 "strip_size_kb": 0, 00:13:21.075 "state": "configuring", 00:13:21.075 "raid_level": "raid1", 00:13:21.075 "superblock": false, 00:13:21.075 "num_base_bdevs": 3, 00:13:21.075 "num_base_bdevs_discovered": 2, 00:13:21.075 "num_base_bdevs_operational": 3, 00:13:21.075 "base_bdevs_list": [ 00:13:21.075 { 00:13:21.075 "name": "BaseBdev1", 00:13:21.075 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:21.075 "is_configured": true, 00:13:21.075 "data_offset": 0, 00:13:21.075 "data_size": 65536 00:13:21.075 }, 00:13:21.075 { 00:13:21.075 "name": null, 00:13:21.075 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:21.075 "is_configured": false, 00:13:21.075 "data_offset": 0, 00:13:21.075 "data_size": 65536 00:13:21.075 }, 00:13:21.075 { 00:13:21.075 "name": "BaseBdev3", 00:13:21.075 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:21.075 "is_configured": true, 00:13:21.075 "data_offset": 0, 00:13:21.075 "data_size": 65536 00:13:21.075 } 00:13:21.075 ] 00:13:21.075 }' 00:13:21.075 18:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.075 18:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.642 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.642 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:21.642 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:21.642 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:21.900 [2024-07-24 18:50:06.689960] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.900 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.900 
18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.901 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.901 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.901 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.901 "name": "Existed_Raid", 00:13:21.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.901 "strip_size_kb": 0, 00:13:21.901 "state": "configuring", 00:13:21.901 "raid_level": "raid1", 00:13:21.901 "superblock": false, 00:13:21.901 "num_base_bdevs": 3, 00:13:21.901 "num_base_bdevs_discovered": 1, 00:13:21.901 "num_base_bdevs_operational": 3, 00:13:21.901 "base_bdevs_list": [ 00:13:21.901 { 00:13:21.901 "name": "BaseBdev1", 00:13:21.901 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:21.901 "is_configured": true, 00:13:21.901 "data_offset": 0, 00:13:21.901 "data_size": 65536 00:13:21.901 }, 00:13:21.901 { 00:13:21.901 "name": null, 00:13:21.901 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:21.901 "is_configured": false, 00:13:21.901 "data_offset": 0, 00:13:21.901 "data_size": 65536 00:13:21.901 }, 00:13:21.901 { 00:13:21.901 "name": null, 00:13:21.901 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:21.901 "is_configured": false, 00:13:21.901 "data_offset": 0, 00:13:21.901 "data_size": 65536 00:13:21.901 } 00:13:21.901 ] 00:13:21.901 }' 00:13:21.901 18:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.901 18:50:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.467 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.467 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:22.726 [2024-07-24 18:50:07.648442] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.726 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.984 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.984 "name": "Existed_Raid", 00:13:22.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.984 "strip_size_kb": 0, 00:13:22.984 "state": "configuring", 00:13:22.984 "raid_level": "raid1", 00:13:22.984 "superblock": false, 00:13:22.984 "num_base_bdevs": 3, 00:13:22.984 "num_base_bdevs_discovered": 2, 00:13:22.984 "num_base_bdevs_operational": 3, 00:13:22.984 "base_bdevs_list": [ 00:13:22.984 { 00:13:22.984 "name": "BaseBdev1", 00:13:22.984 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:22.984 "is_configured": true, 00:13:22.984 "data_offset": 0, 00:13:22.984 "data_size": 65536 00:13:22.984 }, 00:13:22.984 { 00:13:22.984 "name": null, 00:13:22.984 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:22.984 "is_configured": false, 00:13:22.984 "data_offset": 0, 00:13:22.984 "data_size": 65536 00:13:22.984 }, 00:13:22.984 { 00:13:22.984 "name": "BaseBdev3", 00:13:22.984 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:22.984 "is_configured": true, 00:13:22.984 "data_offset": 0, 00:13:22.984 "data_size": 65536 00:13:22.984 } 00:13:22.984 ] 00:13:22.984 }' 00:13:22.984 18:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.984 18:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.551 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.551 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:23.551 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:23.551 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:23.810 [2024-07-24 18:50:08.639014] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.810 
18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.810 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.068 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.068 "name": "Existed_Raid", 00:13:24.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.068 "strip_size_kb": 0, 00:13:24.068 "state": "configuring", 00:13:24.068 "raid_level": "raid1", 00:13:24.068 "superblock": false, 00:13:24.068 "num_base_bdevs": 3, 00:13:24.068 "num_base_bdevs_discovered": 1, 00:13:24.068 "num_base_bdevs_operational": 3, 00:13:24.068 "base_bdevs_list": [ 00:13:24.068 { 00:13:24.068 "name": null, 00:13:24.068 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:24.068 "is_configured": false, 00:13:24.068 "data_offset": 0, 00:13:24.068 "data_size": 65536 00:13:24.068 }, 00:13:24.068 { 00:13:24.068 "name": null, 00:13:24.068 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:24.068 "is_configured": false, 00:13:24.068 "data_offset": 0, 00:13:24.068 "data_size": 65536 00:13:24.068 }, 00:13:24.068 { 00:13:24.068 "name": "BaseBdev3", 00:13:24.068 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:24.068 "is_configured": true, 00:13:24.068 "data_offset": 0, 00:13:24.068 "data_size": 65536 00:13:24.068 } 00:13:24.068 ] 00:13:24.068 }' 00:13:24.068 18:50:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.068 18:50:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.326 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:24.326 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.584 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:24.584 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:24.842 [2024-07-24 18:50:09.651562] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.842 "name": "Existed_Raid", 00:13:24.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.842 "strip_size_kb": 0, 00:13:24.842 "state": "configuring", 00:13:24.842 "raid_level": "raid1", 00:13:24.842 "superblock": false, 00:13:24.842 "num_base_bdevs": 3, 00:13:24.842 "num_base_bdevs_discovered": 2, 00:13:24.842 "num_base_bdevs_operational": 3, 00:13:24.842 "base_bdevs_list": [ 00:13:24.842 { 00:13:24.842 "name": null, 00:13:24.842 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:24.842 "is_configured": false, 00:13:24.842 "data_offset": 0, 00:13:24.842 "data_size": 65536 00:13:24.842 }, 00:13:24.842 { 00:13:24.842 "name": "BaseBdev2", 00:13:24.842 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:24.842 "is_configured": true, 00:13:24.842 "data_offset": 0, 00:13:24.842 "data_size": 65536 00:13:24.842 }, 00:13:24.842 { 00:13:24.842 "name": "BaseBdev3", 00:13:24.842 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:24.842 "is_configured": true, 00:13:24.842 "data_offset": 0, 00:13:24.842 "data_size": 65536 00:13:24.842 } 00:13:24.842 ] 00:13:24.842 }' 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.842 18:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.408 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.408 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:25.666 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:25.666 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:25.666 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.667 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 87382671-618d-4023-933a-d374f030ec48 00:13:25.925 [2024-07-24 18:50:10.842595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:25.925 [2024-07-24 18:50:10.842624] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3ffd0 00:13:25.925 [2024-07-24 18:50:10.842628] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:25.925 [2024-07-24 18:50:10.842777] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf40ee0 00:13:25.925 [2024-07-24 18:50:10.842874] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3ffd0 00:13:25.925 [2024-07-24 18:50:10.842880] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf3ffd0 00:13:25.925 [2024-07-24 18:50:10.843001] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.925 NewBaseBdev 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:25.925 18:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:26.184 18:50:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:26.184 [ 00:13:26.184 { 00:13:26.184 "name": "NewBaseBdev", 00:13:26.184 "aliases": [ 00:13:26.184 "87382671-618d-4023-933a-d374f030ec48" 00:13:26.184 ], 00:13:26.184 "product_name": "Malloc disk", 00:13:26.184 "block_size": 512, 00:13:26.184 "num_blocks": 65536, 00:13:26.184 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:26.184 "assigned_rate_limits": { 00:13:26.184 "rw_ios_per_sec": 0, 00:13:26.184 "rw_mbytes_per_sec": 0, 00:13:26.184 "r_mbytes_per_sec": 0, 00:13:26.184 "w_mbytes_per_sec": 0 00:13:26.184 }, 00:13:26.184 "claimed": true, 00:13:26.184 "claim_type": "exclusive_write", 00:13:26.184 "zoned": false, 00:13:26.184 "supported_io_types": { 00:13:26.184 "read": true, 00:13:26.184 "write": true, 00:13:26.184 "unmap": true, 00:13:26.184 "flush": true, 00:13:26.184 "reset": true, 00:13:26.184 "nvme_admin": false, 00:13:26.184 "nvme_io": false, 00:13:26.184 "nvme_io_md": false, 00:13:26.184 "write_zeroes": true, 00:13:26.184 "zcopy": true, 00:13:26.184 "get_zone_info": false, 00:13:26.184 "zone_management": false, 00:13:26.184 "zone_append": false, 00:13:26.184 "compare": false, 00:13:26.184 "compare_and_write": false, 00:13:26.184 "abort": true, 00:13:26.184 "seek_hole": false, 00:13:26.184 "seek_data": false, 00:13:26.184 "copy": true, 00:13:26.184 "nvme_iov_md": false 00:13:26.184 }, 00:13:26.184 "memory_domains": [ 00:13:26.184 { 00:13:26.184 "dma_device_id": "system", 00:13:26.184 "dma_device_type": 1 00:13:26.184 }, 00:13:26.184 { 00:13:26.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.184 "dma_device_type": 2 00:13:26.184 } 00:13:26.184 ], 00:13:26.184 "driver_specific": {} 00:13:26.184 } 00:13:26.184 ] 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.443 "name": "Existed_Raid", 00:13:26.443 "uuid": "bd2d60dc-36db-4bbd-9081-469c55918337", 00:13:26.443 "strip_size_kb": 0, 00:13:26.443 "state": "online", 00:13:26.443 "raid_level": "raid1", 00:13:26.443 "superblock": false, 00:13:26.443 "num_base_bdevs": 3, 00:13:26.443 "num_base_bdevs_discovered": 3, 00:13:26.443 "num_base_bdevs_operational": 3, 00:13:26.443 "base_bdevs_list": [ 00:13:26.443 { 00:13:26.443 "name": "NewBaseBdev", 00:13:26.443 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:26.443 "is_configured": true, 00:13:26.443 "data_offset": 0, 00:13:26.443 "data_size": 65536 00:13:26.443 }, 00:13:26.443 { 00:13:26.443 "name": "BaseBdev2", 00:13:26.443 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:26.443 "is_configured": true, 00:13:26.443 "data_offset": 0, 00:13:26.443 "data_size": 65536 00:13:26.443 }, 00:13:26.443 { 00:13:26.443 "name": "BaseBdev3", 00:13:26.443 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:26.443 "is_configured": true, 00:13:26.443 "data_offset": 0, 00:13:26.443 "data_size": 65536 00:13:26.443 } 00:13:26.443 ] 00:13:26.443 }' 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.443 18:50:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.009 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:27.010 18:50:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:27.010 18:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:27.010 [2024-07-24 18:50:12.009805] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:27.267 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:27.267 "name": "Existed_Raid", 00:13:27.267 "aliases": [ 00:13:27.267 "bd2d60dc-36db-4bbd-9081-469c55918337" 00:13:27.267 ], 00:13:27.267 "product_name": "Raid Volume", 00:13:27.267 "block_size": 512, 00:13:27.267 "num_blocks": 65536, 00:13:27.267 "uuid": "bd2d60dc-36db-4bbd-9081-469c55918337", 00:13:27.267 "assigned_rate_limits": { 00:13:27.267 "rw_ios_per_sec": 0, 00:13:27.267 "rw_mbytes_per_sec": 0, 00:13:27.267 "r_mbytes_per_sec": 0, 00:13:27.267 "w_mbytes_per_sec": 0 00:13:27.267 }, 00:13:27.267 "claimed": false, 00:13:27.267 "zoned": false, 00:13:27.267 "supported_io_types": { 00:13:27.267 "read": true, 00:13:27.267 "write": true, 00:13:27.267 "unmap": false, 00:13:27.267 "flush": false, 00:13:27.267 "reset": true, 00:13:27.267 "nvme_admin": false, 00:13:27.267 "nvme_io": false, 00:13:27.267 "nvme_io_md": false, 00:13:27.267 "write_zeroes": true, 00:13:27.267 "zcopy": false, 00:13:27.267 "get_zone_info": false, 00:13:27.267 "zone_management": false, 00:13:27.267 "zone_append": false, 00:13:27.267 "compare": false, 00:13:27.267 "compare_and_write": false, 00:13:27.267 "abort": false, 00:13:27.267 "seek_hole": false, 00:13:27.267 "seek_data": false, 00:13:27.267 "copy": false, 00:13:27.267 "nvme_iov_md": false 00:13:27.267 }, 00:13:27.267 "memory_domains": [ 00:13:27.267 { 00:13:27.267 "dma_device_id": "system", 00:13:27.267 "dma_device_type": 1 00:13:27.267 }, 00:13:27.267 { 00:13:27.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.267 "dma_device_type": 2 00:13:27.267 }, 00:13:27.267 { 00:13:27.267 "dma_device_id": "system", 00:13:27.267 "dma_device_type": 1 00:13:27.267 }, 00:13:27.267 { 00:13:27.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.267 "dma_device_type": 2 00:13:27.267 }, 00:13:27.267 { 00:13:27.267 "dma_device_id": "system", 00:13:27.268 "dma_device_type": 1 00:13:27.268 }, 00:13:27.268 { 00:13:27.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.268 "dma_device_type": 2 00:13:27.268 } 00:13:27.268 ], 00:13:27.268 "driver_specific": { 00:13:27.268 "raid": { 00:13:27.268 "uuid": "bd2d60dc-36db-4bbd-9081-469c55918337", 00:13:27.268 "strip_size_kb": 0, 00:13:27.268 "state": "online", 00:13:27.268 "raid_level": "raid1", 00:13:27.268 "superblock": false, 00:13:27.268 "num_base_bdevs": 3, 00:13:27.268 "num_base_bdevs_discovered": 3, 00:13:27.268 "num_base_bdevs_operational": 3, 00:13:27.268 "base_bdevs_list": [ 00:13:27.268 { 00:13:27.268 "name": "NewBaseBdev", 00:13:27.268 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:27.268 "is_configured": true, 00:13:27.268 "data_offset": 0, 00:13:27.268 "data_size": 65536 00:13:27.268 }, 00:13:27.268 { 00:13:27.268 "name": "BaseBdev2", 00:13:27.268 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:27.268 "is_configured": true, 00:13:27.268 "data_offset": 0, 00:13:27.268 "data_size": 65536 00:13:27.268 }, 00:13:27.268 { 00:13:27.268 "name": "BaseBdev3", 00:13:27.268 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:27.268 "is_configured": true, 00:13:27.268 "data_offset": 0, 00:13:27.268 "data_size": 
65536 00:13:27.268 } 00:13:27.268 ] 00:13:27.268 } 00:13:27.268 } 00:13:27.268 }' 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:27.268 BaseBdev2 00:13:27.268 BaseBdev3' 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.268 "name": "NewBaseBdev", 00:13:27.268 "aliases": [ 00:13:27.268 "87382671-618d-4023-933a-d374f030ec48" 00:13:27.268 ], 00:13:27.268 "product_name": "Malloc disk", 00:13:27.268 "block_size": 512, 00:13:27.268 "num_blocks": 65536, 00:13:27.268 "uuid": "87382671-618d-4023-933a-d374f030ec48", 00:13:27.268 "assigned_rate_limits": { 00:13:27.268 "rw_ios_per_sec": 0, 00:13:27.268 "rw_mbytes_per_sec": 0, 00:13:27.268 "r_mbytes_per_sec": 0, 00:13:27.268 "w_mbytes_per_sec": 0 00:13:27.268 }, 00:13:27.268 "claimed": true, 00:13:27.268 "claim_type": "exclusive_write", 00:13:27.268 "zoned": false, 00:13:27.268 "supported_io_types": { 00:13:27.268 "read": true, 00:13:27.268 "write": true, 00:13:27.268 "unmap": true, 00:13:27.268 "flush": true, 00:13:27.268 "reset": true, 00:13:27.268 "nvme_admin": false, 00:13:27.268 "nvme_io": false, 00:13:27.268 "nvme_io_md": false, 00:13:27.268 "write_zeroes": true, 00:13:27.268 "zcopy": true, 00:13:27.268 "get_zone_info": false, 00:13:27.268 "zone_management": false, 00:13:27.268 "zone_append": false, 00:13:27.268 "compare": false, 00:13:27.268 "compare_and_write": false, 00:13:27.268 "abort": true, 00:13:27.268 "seek_hole": false, 00:13:27.268 "seek_data": false, 00:13:27.268 "copy": true, 00:13:27.268 "nvme_iov_md": false 00:13:27.268 }, 00:13:27.268 "memory_domains": [ 00:13:27.268 { 00:13:27.268 "dma_device_id": "system", 00:13:27.268 "dma_device_type": 1 00:13:27.268 }, 00:13:27.268 { 00:13:27.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.268 "dma_device_type": 2 00:13:27.268 } 00:13:27.268 ], 00:13:27.268 "driver_specific": {} 00:13:27.268 }' 00:13:27.268 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.526 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.784 "name": "BaseBdev2", 00:13:27.784 "aliases": [ 00:13:27.784 "2a6431d8-ce4c-441e-9995-d7f3182141f1" 00:13:27.784 ], 00:13:27.784 "product_name": "Malloc disk", 00:13:27.784 "block_size": 512, 00:13:27.784 "num_blocks": 65536, 00:13:27.784 "uuid": "2a6431d8-ce4c-441e-9995-d7f3182141f1", 00:13:27.784 "assigned_rate_limits": { 00:13:27.784 "rw_ios_per_sec": 0, 00:13:27.784 "rw_mbytes_per_sec": 0, 00:13:27.784 "r_mbytes_per_sec": 0, 00:13:27.784 "w_mbytes_per_sec": 0 00:13:27.784 }, 00:13:27.784 "claimed": true, 00:13:27.784 "claim_type": "exclusive_write", 00:13:27.784 "zoned": false, 00:13:27.784 "supported_io_types": { 00:13:27.784 "read": true, 00:13:27.784 "write": true, 00:13:27.784 "unmap": true, 00:13:27.784 "flush": true, 00:13:27.784 "reset": true, 00:13:27.784 "nvme_admin": false, 00:13:27.784 "nvme_io": false, 00:13:27.784 "nvme_io_md": false, 00:13:27.784 "write_zeroes": true, 00:13:27.784 "zcopy": true, 00:13:27.784 "get_zone_info": false, 00:13:27.784 "zone_management": false, 00:13:27.784 "zone_append": false, 00:13:27.784 "compare": false, 00:13:27.784 "compare_and_write": false, 00:13:27.784 "abort": true, 00:13:27.784 "seek_hole": false, 00:13:27.784 "seek_data": false, 00:13:27.784 "copy": true, 00:13:27.784 "nvme_iov_md": false 00:13:27.784 }, 00:13:27.784 "memory_domains": [ 00:13:27.784 { 00:13:27.784 "dma_device_id": "system", 00:13:27.784 "dma_device_type": 1 00:13:27.784 }, 00:13:27.784 { 00:13:27.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.784 "dma_device_type": 2 00:13:27.784 } 00:13:27.784 ], 00:13:27.784 "driver_specific": {} 00:13:27.784 }' 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.784 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.043 18:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.043 18:50:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:28.043 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:28.043 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:28.043 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:28.301 "name": "BaseBdev3", 00:13:28.301 "aliases": [ 00:13:28.301 "ee7e9a24-28e0-4483-8957-0138f513d97e" 00:13:28.301 ], 00:13:28.301 "product_name": "Malloc disk", 00:13:28.301 "block_size": 512, 00:13:28.301 "num_blocks": 65536, 00:13:28.301 "uuid": "ee7e9a24-28e0-4483-8957-0138f513d97e", 00:13:28.301 "assigned_rate_limits": { 00:13:28.301 "rw_ios_per_sec": 0, 00:13:28.301 "rw_mbytes_per_sec": 0, 00:13:28.301 "r_mbytes_per_sec": 0, 00:13:28.301 "w_mbytes_per_sec": 0 00:13:28.301 }, 00:13:28.301 "claimed": true, 00:13:28.301 "claim_type": "exclusive_write", 00:13:28.301 "zoned": false, 00:13:28.301 "supported_io_types": { 00:13:28.301 "read": true, 00:13:28.301 "write": true, 00:13:28.301 "unmap": true, 00:13:28.301 "flush": true, 00:13:28.301 "reset": true, 00:13:28.301 "nvme_admin": false, 00:13:28.301 "nvme_io": false, 00:13:28.301 "nvme_io_md": false, 00:13:28.301 "write_zeroes": true, 00:13:28.301 "zcopy": true, 00:13:28.301 "get_zone_info": false, 00:13:28.301 "zone_management": false, 00:13:28.301 "zone_append": false, 00:13:28.301 "compare": false, 00:13:28.301 "compare_and_write": false, 00:13:28.301 "abort": true, 00:13:28.301 "seek_hole": false, 00:13:28.301 "seek_data": false, 00:13:28.301 "copy": true, 00:13:28.301 "nvme_iov_md": false 00:13:28.301 }, 00:13:28.301 "memory_domains": [ 00:13:28.301 { 00:13:28.301 "dma_device_id": "system", 00:13:28.301 "dma_device_type": 1 00:13:28.301 }, 00:13:28.301 { 00:13:28.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.301 "dma_device_type": 2 00:13:28.301 } 00:13:28.301 ], 00:13:28.301 "driver_specific": {} 00:13:28.301 }' 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:28.301 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:28.559 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:28.817 [2024-07-24 18:50:13.645844] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:28.817 [2024-07-24 18:50:13.645861] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.817 [2024-07-24 18:50:13.645897] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.817 [2024-07-24 18:50:13.646099] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.817 [2024-07-24 18:50:13.646104] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3ffd0 name Existed_Raid, state offline 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2092186 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2092186 ']' 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2092186 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2092186 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2092186' 00:13:28.817 killing process with pid 2092186 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2092186 00:13:28.817 [2024-07-24 18:50:13.702690] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:28.817 18:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2092186 00:13:28.817 [2024-07-24 18:50:13.744195] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:29.075 18:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:29.075 00:13:29.075 real 0m21.553s 00:13:29.075 user 0m39.901s 00:13:29.075 sys 0m3.264s 00:13:29.075 18:50:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:29.075 18:50:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.075 ************************************ 00:13:29.075 END TEST raid_state_function_test 00:13:29.075 ************************************ 00:13:29.075 18:50:14 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:13:29.075 18:50:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:29.075 18:50:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:29.075 18:50:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:29.334 ************************************ 00:13:29.334 START TEST raid_state_function_test_sb 00:13:29.334 ************************************ 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:13:29.334 
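The raid_state_function_test run that finishes above repeatedly drives one and the same check: dump the raid bdev table over the test RPC socket, select Existed_Raid with jq, and compare fields such as state, raid_level and num_base_bdevs_discovered against the expected values after each add/remove of a base bdev. A minimal stand-alone sketch of that check, assembled only from the rpc.py path, socket path and jq filter that appear verbatim in the log above (it is not the suite's own verify_raid_bdev_state implementation, and it assumes an SPDK target is still listening on that socket), would be:

  # Illustrative sketch only; commands and paths are taken from the log above, not from bdev_raid.sh
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Fetch the raid bdev named Existed_Raid from the running target
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  # Compare the fields the harness asserts on
  [ "$(echo "$info" | jq -r '.state')" = "configuring" ] || echo "unexpected state"
  [ "$(echo "$info" | jq -r '.raid_level')" = "raid1" ] || echo "unexpected raid level"
  [ "$(echo "$info" | jq '.num_base_bdevs_discovered')" -ge 1 ] || echo "no base bdevs discovered"
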
18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2096427 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2096427' 00:13:29.334 Process raid pid: 2096427 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2096427 /var/tmp/spdk-raid.sock 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2096427 ']' 00:13:29.334 18:50:14 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:29.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:29.334 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.334 [2024-07-24 18:50:14.150124] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:13:29.334 [2024-07-24 18:50:14.150161] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:29.334 [2024-07-24 18:50:14.213342] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:29.334 [2024-07-24 18:50:14.284203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:29.334 [2024-07-24 18:50:14.336056] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.334 [2024-07-24 18:50:14.336083] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:30.269 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:30.269 18:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:30.269 18:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:30.269 [2024-07-24 18:50:15.095055] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:30.269 [2024-07-24 18:50:15.095084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:30.269 [2024-07-24 18:50:15.095089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:30.269 [2024-07-24 18:50:15.095094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:30.269 [2024-07-24 18:50:15.095100] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:30.269 [2024-07-24 18:50:15.095121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.269 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.527 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.527 "name": "Existed_Raid", 00:13:30.527 "uuid": "eefd77b4-83a4-47a8-a5e8-5349381552c5", 00:13:30.527 "strip_size_kb": 0, 00:13:30.527 "state": "configuring", 00:13:30.527 "raid_level": "raid1", 00:13:30.527 "superblock": true, 00:13:30.527 "num_base_bdevs": 3, 00:13:30.527 "num_base_bdevs_discovered": 0, 00:13:30.527 "num_base_bdevs_operational": 3, 00:13:30.527 "base_bdevs_list": [ 00:13:30.527 { 00:13:30.527 "name": "BaseBdev1", 00:13:30.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.527 "is_configured": false, 00:13:30.527 "data_offset": 0, 00:13:30.527 "data_size": 0 00:13:30.527 }, 00:13:30.527 { 00:13:30.527 "name": "BaseBdev2", 00:13:30.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.527 "is_configured": false, 00:13:30.527 "data_offset": 0, 00:13:30.527 "data_size": 0 00:13:30.527 }, 00:13:30.527 { 00:13:30.527 "name": "BaseBdev3", 00:13:30.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.527 "is_configured": false, 00:13:30.527 "data_offset": 0, 00:13:30.527 "data_size": 0 00:13:30.527 } 00:13:30.527 ] 00:13:30.527 }' 00:13:30.527 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.527 18:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:30.785 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:31.043 [2024-07-24 18:50:15.933191] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:31.043 [2024-07-24 18:50:15.933212] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250eba0 name Existed_Raid, state configuring 00:13:31.043 18:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:31.302 [2024-07-24 18:50:16.109677] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:31.302 [2024-07-24 18:50:16.109694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:31.302 [2024-07-24 18:50:16.109699] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:31.302 [2024-07-24 18:50:16.109704] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:31.302 [2024-07-24 
18:50:16.109708] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:31.302 [2024-07-24 18:50:16.109712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:31.302 [2024-07-24 18:50:16.290321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:31.302 BaseBdev1 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:31.302 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:31.561 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:31.819 [ 00:13:31.819 { 00:13:31.819 "name": "BaseBdev1", 00:13:31.819 "aliases": [ 00:13:31.819 "09536c77-77e5-4213-a24e-2b9deea83af0" 00:13:31.819 ], 00:13:31.819 "product_name": "Malloc disk", 00:13:31.819 "block_size": 512, 00:13:31.819 "num_blocks": 65536, 00:13:31.819 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:31.819 "assigned_rate_limits": { 00:13:31.819 "rw_ios_per_sec": 0, 00:13:31.819 "rw_mbytes_per_sec": 0, 00:13:31.819 "r_mbytes_per_sec": 0, 00:13:31.819 "w_mbytes_per_sec": 0 00:13:31.819 }, 00:13:31.819 "claimed": true, 00:13:31.819 "claim_type": "exclusive_write", 00:13:31.819 "zoned": false, 00:13:31.819 "supported_io_types": { 00:13:31.819 "read": true, 00:13:31.819 "write": true, 00:13:31.819 "unmap": true, 00:13:31.819 "flush": true, 00:13:31.819 "reset": true, 00:13:31.819 "nvme_admin": false, 00:13:31.819 "nvme_io": false, 00:13:31.819 "nvme_io_md": false, 00:13:31.819 "write_zeroes": true, 00:13:31.819 "zcopy": true, 00:13:31.819 "get_zone_info": false, 00:13:31.819 "zone_management": false, 00:13:31.819 "zone_append": false, 00:13:31.819 "compare": false, 00:13:31.819 "compare_and_write": false, 00:13:31.819 "abort": true, 00:13:31.819 "seek_hole": false, 00:13:31.819 "seek_data": false, 00:13:31.819 "copy": true, 00:13:31.819 "nvme_iov_md": false 00:13:31.819 }, 00:13:31.819 "memory_domains": [ 00:13:31.819 { 00:13:31.819 "dma_device_id": "system", 00:13:31.819 "dma_device_type": 1 00:13:31.819 }, 00:13:31.819 { 00:13:31.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.819 "dma_device_type": 2 00:13:31.819 } 00:13:31.819 ], 00:13:31.819 "driver_specific": {} 00:13:31.819 } 00:13:31.819 ] 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.819 "name": "Existed_Raid", 00:13:31.819 "uuid": "65693509-766b-468d-9544-ce0d2b5274d4", 00:13:31.819 "strip_size_kb": 0, 00:13:31.819 "state": "configuring", 00:13:31.819 "raid_level": "raid1", 00:13:31.819 "superblock": true, 00:13:31.819 "num_base_bdevs": 3, 00:13:31.819 "num_base_bdevs_discovered": 1, 00:13:31.819 "num_base_bdevs_operational": 3, 00:13:31.819 "base_bdevs_list": [ 00:13:31.819 { 00:13:31.819 "name": "BaseBdev1", 00:13:31.819 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:31.819 "is_configured": true, 00:13:31.819 "data_offset": 2048, 00:13:31.819 "data_size": 63488 00:13:31.819 }, 00:13:31.819 { 00:13:31.819 "name": "BaseBdev2", 00:13:31.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.819 "is_configured": false, 00:13:31.819 "data_offset": 0, 00:13:31.819 "data_size": 0 00:13:31.819 }, 00:13:31.819 { 00:13:31.819 "name": "BaseBdev3", 00:13:31.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.819 "is_configured": false, 00:13:31.819 "data_offset": 0, 00:13:31.819 "data_size": 0 00:13:31.819 } 00:13:31.819 ] 00:13:31.819 }' 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.819 18:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:32.387 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:32.645 [2024-07-24 18:50:17.457320] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:32.645 [2024-07-24 18:50:17.457352] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250e470 name Existed_Raid, state configuring 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create 
-s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:32.645 [2024-07-24 18:50:17.625785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:32.645 [2024-07-24 18:50:17.626839] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:32.645 [2024-07-24 18:50:17.626864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:32.645 [2024-07-24 18:50:17.626875] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:32.645 [2024-07-24 18:50:17.626880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.645 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.904 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.904 "name": "Existed_Raid", 00:13:32.904 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:32.904 "strip_size_kb": 0, 00:13:32.904 "state": "configuring", 00:13:32.904 "raid_level": "raid1", 00:13:32.904 "superblock": true, 00:13:32.904 "num_base_bdevs": 3, 00:13:32.904 "num_base_bdevs_discovered": 1, 00:13:32.904 "num_base_bdevs_operational": 3, 00:13:32.904 "base_bdevs_list": [ 00:13:32.904 { 00:13:32.904 "name": "BaseBdev1", 00:13:32.904 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:32.904 "is_configured": true, 00:13:32.904 "data_offset": 2048, 00:13:32.904 "data_size": 63488 00:13:32.904 }, 00:13:32.904 { 00:13:32.904 "name": "BaseBdev2", 00:13:32.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.904 "is_configured": false, 00:13:32.904 "data_offset": 0, 00:13:32.904 "data_size": 0 00:13:32.904 }, 00:13:32.904 { 00:13:32.904 "name": "BaseBdev3", 00:13:32.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.904 "is_configured": 
false, 00:13:32.904 "data_offset": 0, 00:13:32.904 "data_size": 0 00:13:32.904 } 00:13:32.904 ] 00:13:32.904 }' 00:13:32.904 18:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.904 18:50:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:33.470 [2024-07-24 18:50:18.442445] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:33.470 BaseBdev2 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:33.470 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:33.729 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:34.004 [ 00:13:34.004 { 00:13:34.004 "name": "BaseBdev2", 00:13:34.004 "aliases": [ 00:13:34.004 "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf" 00:13:34.004 ], 00:13:34.004 "product_name": "Malloc disk", 00:13:34.004 "block_size": 512, 00:13:34.004 "num_blocks": 65536, 00:13:34.004 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:34.004 "assigned_rate_limits": { 00:13:34.004 "rw_ios_per_sec": 0, 00:13:34.004 "rw_mbytes_per_sec": 0, 00:13:34.004 "r_mbytes_per_sec": 0, 00:13:34.004 "w_mbytes_per_sec": 0 00:13:34.004 }, 00:13:34.004 "claimed": true, 00:13:34.004 "claim_type": "exclusive_write", 00:13:34.004 "zoned": false, 00:13:34.004 "supported_io_types": { 00:13:34.004 "read": true, 00:13:34.004 "write": true, 00:13:34.004 "unmap": true, 00:13:34.004 "flush": true, 00:13:34.004 "reset": true, 00:13:34.004 "nvme_admin": false, 00:13:34.004 "nvme_io": false, 00:13:34.004 "nvme_io_md": false, 00:13:34.004 "write_zeroes": true, 00:13:34.004 "zcopy": true, 00:13:34.005 "get_zone_info": false, 00:13:34.005 "zone_management": false, 00:13:34.005 "zone_append": false, 00:13:34.005 "compare": false, 00:13:34.005 "compare_and_write": false, 00:13:34.005 "abort": true, 00:13:34.005 "seek_hole": false, 00:13:34.005 "seek_data": false, 00:13:34.005 "copy": true, 00:13:34.005 "nvme_iov_md": false 00:13:34.005 }, 00:13:34.005 "memory_domains": [ 00:13:34.005 { 00:13:34.005 "dma_device_id": "system", 00:13:34.005 "dma_device_type": 1 00:13:34.005 }, 00:13:34.005 { 00:13:34.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.005 "dma_device_type": 2 00:13:34.005 } 00:13:34.005 ], 00:13:34.005 "driver_specific": {} 00:13:34.005 } 00:13:34.005 ] 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@905 -- # return 0 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.005 "name": "Existed_Raid", 00:13:34.005 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:34.005 "strip_size_kb": 0, 00:13:34.005 "state": "configuring", 00:13:34.005 "raid_level": "raid1", 00:13:34.005 "superblock": true, 00:13:34.005 "num_base_bdevs": 3, 00:13:34.005 "num_base_bdevs_discovered": 2, 00:13:34.005 "num_base_bdevs_operational": 3, 00:13:34.005 "base_bdevs_list": [ 00:13:34.005 { 00:13:34.005 "name": "BaseBdev1", 00:13:34.005 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:34.005 "is_configured": true, 00:13:34.005 "data_offset": 2048, 00:13:34.005 "data_size": 63488 00:13:34.005 }, 00:13:34.005 { 00:13:34.005 "name": "BaseBdev2", 00:13:34.005 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:34.005 "is_configured": true, 00:13:34.005 "data_offset": 2048, 00:13:34.005 "data_size": 63488 00:13:34.005 }, 00:13:34.005 { 00:13:34.005 "name": "BaseBdev3", 00:13:34.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.005 "is_configured": false, 00:13:34.005 "data_offset": 0, 00:13:34.005 "data_size": 0 00:13:34.005 } 00:13:34.005 ] 00:13:34.005 }' 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.005 18:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:34.581 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:34.840 [2024-07-24 18:50:19.624102] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
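(For reference, the RPC sequence exercised above can be replayed against a standalone SPDK target. This is a minimal sketch, assuming the target is already listening on /var/tmp/spdk-raid.sock and that rpc.py is on PATH; the commands and arguments mirror the ones recorded in the log, but the loop structure is illustrative rather than the test script's own.)

# Create a superblock-enabled raid1 volume whose base bdevs do not exist yet;
# the raid stays in the "configuring" state until all of them are claimed.
rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# Create the three 32 MiB / 512-byte-block malloc base bdevs (65536 blocks each,
# as seen in the bdev_get_bdevs output above); each one is examined and claimed,
# and the raid transitions to "online" once the last base bdev is in place.
for b in BaseBdev1 BaseBdev2 BaseBdev3; do
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "$b"
    rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
done

# Inspect the resulting raid state, as the verify helper in the log does.
rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state'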
00:13:34.840 [2024-07-24 18:50:19.624216] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x250f360 00:13:34.840 [2024-07-24 18:50:19.624224] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:34.840 [2024-07-24 18:50:19.624344] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b3660 00:13:34.840 [2024-07-24 18:50:19.624426] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250f360 00:13:34.840 [2024-07-24 18:50:19.624431] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250f360 00:13:34.840 [2024-07-24 18:50:19.624508] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.840 BaseBdev3 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.840 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:35.099 [ 00:13:35.099 { 00:13:35.099 "name": "BaseBdev3", 00:13:35.099 "aliases": [ 00:13:35.099 "5f396ec7-dc28-4e27-85dc-37df4929a7ee" 00:13:35.099 ], 00:13:35.099 "product_name": "Malloc disk", 00:13:35.099 "block_size": 512, 00:13:35.099 "num_blocks": 65536, 00:13:35.099 "uuid": "5f396ec7-dc28-4e27-85dc-37df4929a7ee", 00:13:35.099 "assigned_rate_limits": { 00:13:35.099 "rw_ios_per_sec": 0, 00:13:35.099 "rw_mbytes_per_sec": 0, 00:13:35.099 "r_mbytes_per_sec": 0, 00:13:35.099 "w_mbytes_per_sec": 0 00:13:35.099 }, 00:13:35.099 "claimed": true, 00:13:35.099 "claim_type": "exclusive_write", 00:13:35.099 "zoned": false, 00:13:35.099 "supported_io_types": { 00:13:35.099 "read": true, 00:13:35.099 "write": true, 00:13:35.099 "unmap": true, 00:13:35.099 "flush": true, 00:13:35.099 "reset": true, 00:13:35.099 "nvme_admin": false, 00:13:35.099 "nvme_io": false, 00:13:35.099 "nvme_io_md": false, 00:13:35.099 "write_zeroes": true, 00:13:35.099 "zcopy": true, 00:13:35.099 "get_zone_info": false, 00:13:35.099 "zone_management": false, 00:13:35.099 "zone_append": false, 00:13:35.099 "compare": false, 00:13:35.099 "compare_and_write": false, 00:13:35.099 "abort": true, 00:13:35.099 "seek_hole": false, 00:13:35.099 "seek_data": false, 00:13:35.099 "copy": true, 00:13:35.099 "nvme_iov_md": false 00:13:35.099 }, 00:13:35.099 "memory_domains": [ 00:13:35.099 { 00:13:35.099 "dma_device_id": "system", 00:13:35.099 "dma_device_type": 1 00:13:35.099 }, 00:13:35.099 { 00:13:35.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.099 "dma_device_type": 2 00:13:35.099 } 00:13:35.099 ], 00:13:35.099 "driver_specific": {} 00:13:35.099 } 
00:13:35.099 ] 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.099 18:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.357 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.357 "name": "Existed_Raid", 00:13:35.357 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:35.357 "strip_size_kb": 0, 00:13:35.357 "state": "online", 00:13:35.357 "raid_level": "raid1", 00:13:35.357 "superblock": true, 00:13:35.357 "num_base_bdevs": 3, 00:13:35.357 "num_base_bdevs_discovered": 3, 00:13:35.357 "num_base_bdevs_operational": 3, 00:13:35.357 "base_bdevs_list": [ 00:13:35.357 { 00:13:35.357 "name": "BaseBdev1", 00:13:35.357 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:35.357 "is_configured": true, 00:13:35.357 "data_offset": 2048, 00:13:35.357 "data_size": 63488 00:13:35.357 }, 00:13:35.357 { 00:13:35.357 "name": "BaseBdev2", 00:13:35.357 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:35.357 "is_configured": true, 00:13:35.357 "data_offset": 2048, 00:13:35.357 "data_size": 63488 00:13:35.357 }, 00:13:35.357 { 00:13:35.357 "name": "BaseBdev3", 00:13:35.357 "uuid": "5f396ec7-dc28-4e27-85dc-37df4929a7ee", 00:13:35.357 "is_configured": true, 00:13:35.357 "data_offset": 2048, 00:13:35.357 "data_size": 63488 00:13:35.357 } 00:13:35.357 ] 00:13:35.357 }' 00:13:35.357 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.357 18:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:35.924 18:50:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:35.924 [2024-07-24 18:50:20.803339] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:35.924 "name": "Existed_Raid", 00:13:35.924 "aliases": [ 00:13:35.924 "42daf345-dc29-4ee1-afbf-2778a204f760" 00:13:35.924 ], 00:13:35.924 "product_name": "Raid Volume", 00:13:35.924 "block_size": 512, 00:13:35.924 "num_blocks": 63488, 00:13:35.924 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:35.924 "assigned_rate_limits": { 00:13:35.924 "rw_ios_per_sec": 0, 00:13:35.924 "rw_mbytes_per_sec": 0, 00:13:35.924 "r_mbytes_per_sec": 0, 00:13:35.924 "w_mbytes_per_sec": 0 00:13:35.924 }, 00:13:35.924 "claimed": false, 00:13:35.924 "zoned": false, 00:13:35.924 "supported_io_types": { 00:13:35.924 "read": true, 00:13:35.924 "write": true, 00:13:35.924 "unmap": false, 00:13:35.924 "flush": false, 00:13:35.924 "reset": true, 00:13:35.924 "nvme_admin": false, 00:13:35.924 "nvme_io": false, 00:13:35.924 "nvme_io_md": false, 00:13:35.924 "write_zeroes": true, 00:13:35.924 "zcopy": false, 00:13:35.924 "get_zone_info": false, 00:13:35.924 "zone_management": false, 00:13:35.924 "zone_append": false, 00:13:35.924 "compare": false, 00:13:35.924 "compare_and_write": false, 00:13:35.924 "abort": false, 00:13:35.924 "seek_hole": false, 00:13:35.924 "seek_data": false, 00:13:35.924 "copy": false, 00:13:35.924 "nvme_iov_md": false 00:13:35.924 }, 00:13:35.924 "memory_domains": [ 00:13:35.924 { 00:13:35.924 "dma_device_id": "system", 00:13:35.924 "dma_device_type": 1 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.924 "dma_device_type": 2 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "dma_device_id": "system", 00:13:35.924 "dma_device_type": 1 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.924 "dma_device_type": 2 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "dma_device_id": "system", 00:13:35.924 "dma_device_type": 1 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.924 "dma_device_type": 2 00:13:35.924 } 00:13:35.924 ], 00:13:35.924 "driver_specific": { 00:13:35.924 "raid": { 00:13:35.924 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:35.924 "strip_size_kb": 0, 00:13:35.924 "state": "online", 00:13:35.924 "raid_level": "raid1", 00:13:35.924 "superblock": true, 00:13:35.924 "num_base_bdevs": 3, 00:13:35.924 "num_base_bdevs_discovered": 3, 00:13:35.924 "num_base_bdevs_operational": 3, 00:13:35.924 "base_bdevs_list": [ 00:13:35.924 { 00:13:35.924 "name": "BaseBdev1", 00:13:35.924 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:35.924 "is_configured": true, 00:13:35.924 "data_offset": 2048, 00:13:35.924 "data_size": 63488 
00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "name": "BaseBdev2", 00:13:35.924 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:35.924 "is_configured": true, 00:13:35.924 "data_offset": 2048, 00:13:35.924 "data_size": 63488 00:13:35.924 }, 00:13:35.924 { 00:13:35.924 "name": "BaseBdev3", 00:13:35.924 "uuid": "5f396ec7-dc28-4e27-85dc-37df4929a7ee", 00:13:35.924 "is_configured": true, 00:13:35.924 "data_offset": 2048, 00:13:35.924 "data_size": 63488 00:13:35.924 } 00:13:35.924 ] 00:13:35.924 } 00:13:35.924 } 00:13:35.924 }' 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:35.924 BaseBdev2 00:13:35.924 BaseBdev3' 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:35.924 18:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.183 "name": "BaseBdev1", 00:13:36.183 "aliases": [ 00:13:36.183 "09536c77-77e5-4213-a24e-2b9deea83af0" 00:13:36.183 ], 00:13:36.183 "product_name": "Malloc disk", 00:13:36.183 "block_size": 512, 00:13:36.183 "num_blocks": 65536, 00:13:36.183 "uuid": "09536c77-77e5-4213-a24e-2b9deea83af0", 00:13:36.183 "assigned_rate_limits": { 00:13:36.183 "rw_ios_per_sec": 0, 00:13:36.183 "rw_mbytes_per_sec": 0, 00:13:36.183 "r_mbytes_per_sec": 0, 00:13:36.183 "w_mbytes_per_sec": 0 00:13:36.183 }, 00:13:36.183 "claimed": true, 00:13:36.183 "claim_type": "exclusive_write", 00:13:36.183 "zoned": false, 00:13:36.183 "supported_io_types": { 00:13:36.183 "read": true, 00:13:36.183 "write": true, 00:13:36.183 "unmap": true, 00:13:36.183 "flush": true, 00:13:36.183 "reset": true, 00:13:36.183 "nvme_admin": false, 00:13:36.183 "nvme_io": false, 00:13:36.183 "nvme_io_md": false, 00:13:36.183 "write_zeroes": true, 00:13:36.183 "zcopy": true, 00:13:36.183 "get_zone_info": false, 00:13:36.183 "zone_management": false, 00:13:36.183 "zone_append": false, 00:13:36.183 "compare": false, 00:13:36.183 "compare_and_write": false, 00:13:36.183 "abort": true, 00:13:36.183 "seek_hole": false, 00:13:36.183 "seek_data": false, 00:13:36.183 "copy": true, 00:13:36.183 "nvme_iov_md": false 00:13:36.183 }, 00:13:36.183 "memory_domains": [ 00:13:36.183 { 00:13:36.183 "dma_device_id": "system", 00:13:36.183 "dma_device_type": 1 00:13:36.183 }, 00:13:36.183 { 00:13:36.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.183 "dma_device_type": 2 00:13:36.183 } 00:13:36.183 ], 00:13:36.183 "driver_specific": {} 00:13:36.183 }' 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
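(The repeated jq .block_size / .md_size / .md_interleave / .dif_type checks that follow amount to asserting that every configured base bdev reports the same block and metadata layout as the raid volume. A rough bash equivalent is sketched below; the variable names are hypothetical, but the socket path, the bdev_get_bdevs calls, and the jq filters are the ones recorded in the log.)

# Fetch the raid volume's own JSON description once.
raid=$(rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid | jq '.[]')

# Walk the configured base bdevs listed by the raid volume itself.
for name in $(echo "$raid" | jq -r \
        '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'); do
    base=$(rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$name" | jq '.[]')
    # Each layout field must match the raid volume (512-byte blocks, no
    # separate metadata and no DIF in this configuration, hence the
    # "512 == 512" and "null == null" comparisons seen in the log).
    [[ $(echo "$base" | jq .block_size)    == $(echo "$raid" | jq .block_size) ]]
    [[ $(echo "$base" | jq .md_size)       == $(echo "$raid" | jq .md_size) ]]
    [[ $(echo "$base" | jq .md_interleave) == $(echo "$raid" | jq .md_interleave) ]]
    [[ $(echo "$base" | jq .dif_type)      == $(echo "$raid" | jq .dif_type) ]]
done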
00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.183 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:36.442 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.700 "name": "BaseBdev2", 00:13:36.700 "aliases": [ 00:13:36.700 "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf" 00:13:36.700 ], 00:13:36.700 "product_name": "Malloc disk", 00:13:36.700 "block_size": 512, 00:13:36.700 "num_blocks": 65536, 00:13:36.700 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:36.700 "assigned_rate_limits": { 00:13:36.700 "rw_ios_per_sec": 0, 00:13:36.700 "rw_mbytes_per_sec": 0, 00:13:36.700 "r_mbytes_per_sec": 0, 00:13:36.700 "w_mbytes_per_sec": 0 00:13:36.700 }, 00:13:36.700 "claimed": true, 00:13:36.700 "claim_type": "exclusive_write", 00:13:36.700 "zoned": false, 00:13:36.700 "supported_io_types": { 00:13:36.700 "read": true, 00:13:36.700 "write": true, 00:13:36.700 "unmap": true, 00:13:36.700 "flush": true, 00:13:36.700 "reset": true, 00:13:36.700 "nvme_admin": false, 00:13:36.700 "nvme_io": false, 00:13:36.700 "nvme_io_md": false, 00:13:36.700 "write_zeroes": true, 00:13:36.700 "zcopy": true, 00:13:36.700 "get_zone_info": false, 00:13:36.700 "zone_management": false, 00:13:36.700 "zone_append": false, 00:13:36.700 "compare": false, 00:13:36.700 "compare_and_write": false, 00:13:36.700 "abort": true, 00:13:36.700 "seek_hole": false, 00:13:36.700 "seek_data": false, 00:13:36.700 "copy": true, 00:13:36.700 "nvme_iov_md": false 00:13:36.700 }, 00:13:36.700 "memory_domains": [ 00:13:36.700 { 00:13:36.700 "dma_device_id": "system", 00:13:36.700 "dma_device_type": 1 00:13:36.700 }, 00:13:36.700 { 00:13:36.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.700 "dma_device_type": 2 00:13:36.700 } 00:13:36.700 ], 00:13:36.700 "driver_specific": {} 00:13:36.700 }' 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.700 18:50:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.700 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.958 "name": "BaseBdev3", 00:13:36.958 "aliases": [ 00:13:36.958 "5f396ec7-dc28-4e27-85dc-37df4929a7ee" 00:13:36.958 ], 00:13:36.958 "product_name": "Malloc disk", 00:13:36.958 "block_size": 512, 00:13:36.958 "num_blocks": 65536, 00:13:36.958 "uuid": "5f396ec7-dc28-4e27-85dc-37df4929a7ee", 00:13:36.958 "assigned_rate_limits": { 00:13:36.958 "rw_ios_per_sec": 0, 00:13:36.958 "rw_mbytes_per_sec": 0, 00:13:36.958 "r_mbytes_per_sec": 0, 00:13:36.958 "w_mbytes_per_sec": 0 00:13:36.958 }, 00:13:36.958 "claimed": true, 00:13:36.958 "claim_type": "exclusive_write", 00:13:36.958 "zoned": false, 00:13:36.958 "supported_io_types": { 00:13:36.958 "read": true, 00:13:36.958 "write": true, 00:13:36.958 "unmap": true, 00:13:36.958 "flush": true, 00:13:36.958 "reset": true, 00:13:36.958 "nvme_admin": false, 00:13:36.958 "nvme_io": false, 00:13:36.958 "nvme_io_md": false, 00:13:36.958 "write_zeroes": true, 00:13:36.958 "zcopy": true, 00:13:36.958 "get_zone_info": false, 00:13:36.958 "zone_management": false, 00:13:36.958 "zone_append": false, 00:13:36.958 "compare": false, 00:13:36.958 "compare_and_write": false, 00:13:36.958 "abort": true, 00:13:36.958 "seek_hole": false, 00:13:36.958 "seek_data": false, 00:13:36.958 "copy": true, 00:13:36.958 "nvme_iov_md": false 00:13:36.958 }, 00:13:36.958 "memory_domains": [ 00:13:36.958 { 00:13:36.958 "dma_device_id": "system", 00:13:36.958 "dma_device_type": 1 00:13:36.958 }, 00:13:36.958 { 00:13:36.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.958 "dma_device_type": 2 00:13:36.958 } 00:13:36.958 ], 00:13:36.958 "driver_specific": {} 00:13:36.958 }' 00:13:36.958 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.217 18:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.217 18:50:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.217 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:37.476 [2024-07-24 18:50:22.403320] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.476 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.734 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.734 "name": "Existed_Raid", 00:13:37.734 "uuid": "42daf345-dc29-4ee1-afbf-2778a204f760", 00:13:37.734 "strip_size_kb": 0, 00:13:37.734 "state": "online", 00:13:37.734 "raid_level": "raid1", 00:13:37.734 "superblock": true, 00:13:37.734 "num_base_bdevs": 3, 00:13:37.734 "num_base_bdevs_discovered": 2, 00:13:37.734 "num_base_bdevs_operational": 2, 00:13:37.734 "base_bdevs_list": [ 00:13:37.734 { 00:13:37.734 "name": null, 00:13:37.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.734 "is_configured": false, 00:13:37.734 "data_offset": 2048, 
00:13:37.734 "data_size": 63488 00:13:37.734 }, 00:13:37.734 { 00:13:37.734 "name": "BaseBdev2", 00:13:37.734 "uuid": "b81d6928-ca67-4ab4-8aac-a1bb9baa0fbf", 00:13:37.734 "is_configured": true, 00:13:37.734 "data_offset": 2048, 00:13:37.734 "data_size": 63488 00:13:37.734 }, 00:13:37.734 { 00:13:37.734 "name": "BaseBdev3", 00:13:37.734 "uuid": "5f396ec7-dc28-4e27-85dc-37df4929a7ee", 00:13:37.734 "is_configured": true, 00:13:37.734 "data_offset": 2048, 00:13:37.734 "data_size": 63488 00:13:37.734 } 00:13:37.734 ] 00:13:37.734 }' 00:13:37.734 18:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.734 18:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:38.301 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:38.559 [2024-07-24 18:50:23.378762] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:38.559 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:38.818 [2024-07-24 18:50:23.717308] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:38.818 [2024-07-24 18:50:23.717380] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.818 [2024-07-24 18:50:23.727405] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.818 [2024-07-24 18:50:23.727432] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.818 [2024-07-24 18:50:23.727438] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250f360 name Existed_Raid, state offline 00:13:38.818 18:50:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:38.818 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:38.818 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.818 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:39.076 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:39.077 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:39.077 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:39.077 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:39.077 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:39.077 18:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:39.077 BaseBdev2 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:39.077 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:39.335 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:39.593 [ 00:13:39.593 { 00:13:39.593 "name": "BaseBdev2", 00:13:39.593 "aliases": [ 00:13:39.593 "ec88b282-91f3-4f3b-8f23-3968be56b030" 00:13:39.594 ], 00:13:39.594 "product_name": "Malloc disk", 00:13:39.594 "block_size": 512, 00:13:39.594 "num_blocks": 65536, 00:13:39.594 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:39.594 "assigned_rate_limits": { 00:13:39.594 "rw_ios_per_sec": 0, 00:13:39.594 "rw_mbytes_per_sec": 0, 00:13:39.594 "r_mbytes_per_sec": 0, 00:13:39.594 "w_mbytes_per_sec": 0 00:13:39.594 }, 00:13:39.594 "claimed": false, 00:13:39.594 "zoned": false, 00:13:39.594 "supported_io_types": { 00:13:39.594 "read": true, 00:13:39.594 "write": true, 00:13:39.594 "unmap": true, 00:13:39.594 "flush": true, 00:13:39.594 "reset": true, 00:13:39.594 "nvme_admin": false, 00:13:39.594 "nvme_io": false, 00:13:39.594 "nvme_io_md": false, 00:13:39.594 "write_zeroes": true, 00:13:39.594 "zcopy": true, 00:13:39.594 "get_zone_info": false, 00:13:39.594 "zone_management": false, 00:13:39.594 "zone_append": false, 00:13:39.594 "compare": false, 00:13:39.594 "compare_and_write": false, 00:13:39.594 "abort": true, 00:13:39.594 "seek_hole": false, 00:13:39.594 
"seek_data": false, 00:13:39.594 "copy": true, 00:13:39.594 "nvme_iov_md": false 00:13:39.594 }, 00:13:39.594 "memory_domains": [ 00:13:39.594 { 00:13:39.594 "dma_device_id": "system", 00:13:39.594 "dma_device_type": 1 00:13:39.594 }, 00:13:39.594 { 00:13:39.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.594 "dma_device_type": 2 00:13:39.594 } 00:13:39.594 ], 00:13:39.594 "driver_specific": {} 00:13:39.594 } 00:13:39.594 ] 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:39.594 BaseBdev3 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:39.594 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:39.852 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:40.111 [ 00:13:40.111 { 00:13:40.111 "name": "BaseBdev3", 00:13:40.111 "aliases": [ 00:13:40.111 "cf281e99-13a0-44f5-8354-9324e89db88c" 00:13:40.111 ], 00:13:40.111 "product_name": "Malloc disk", 00:13:40.111 "block_size": 512, 00:13:40.111 "num_blocks": 65536, 00:13:40.111 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:40.111 "assigned_rate_limits": { 00:13:40.111 "rw_ios_per_sec": 0, 00:13:40.111 "rw_mbytes_per_sec": 0, 00:13:40.111 "r_mbytes_per_sec": 0, 00:13:40.111 "w_mbytes_per_sec": 0 00:13:40.111 }, 00:13:40.111 "claimed": false, 00:13:40.111 "zoned": false, 00:13:40.111 "supported_io_types": { 00:13:40.111 "read": true, 00:13:40.111 "write": true, 00:13:40.111 "unmap": true, 00:13:40.111 "flush": true, 00:13:40.111 "reset": true, 00:13:40.111 "nvme_admin": false, 00:13:40.111 "nvme_io": false, 00:13:40.111 "nvme_io_md": false, 00:13:40.111 "write_zeroes": true, 00:13:40.111 "zcopy": true, 00:13:40.111 "get_zone_info": false, 00:13:40.111 "zone_management": false, 00:13:40.111 "zone_append": false, 00:13:40.111 "compare": false, 00:13:40.111 "compare_and_write": false, 00:13:40.111 "abort": true, 00:13:40.111 "seek_hole": false, 00:13:40.111 "seek_data": false, 00:13:40.111 "copy": true, 00:13:40.111 "nvme_iov_md": false 00:13:40.111 }, 00:13:40.111 "memory_domains": [ 00:13:40.111 { 00:13:40.111 "dma_device_id": "system", 00:13:40.111 "dma_device_type": 1 00:13:40.111 }, 00:13:40.111 { 
00:13:40.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.111 "dma_device_type": 2 00:13:40.111 } 00:13:40.111 ], 00:13:40.111 "driver_specific": {} 00:13:40.111 } 00:13:40.111 ] 00:13:40.111 18:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:40.111 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:40.111 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:40.111 18:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:40.111 [2024-07-24 18:50:25.010243] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:40.111 [2024-07-24 18:50:25.010274] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:40.111 [2024-07-24 18:50:25.010286] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.111 [2024-07-24 18:50:25.011214] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.111 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.369 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.369 "name": "Existed_Raid", 00:13:40.369 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:40.369 "strip_size_kb": 0, 00:13:40.369 "state": "configuring", 00:13:40.369 "raid_level": "raid1", 00:13:40.369 "superblock": true, 00:13:40.369 "num_base_bdevs": 3, 00:13:40.369 "num_base_bdevs_discovered": 2, 00:13:40.369 "num_base_bdevs_operational": 3, 00:13:40.369 "base_bdevs_list": [ 00:13:40.369 { 00:13:40.369 "name": "BaseBdev1", 00:13:40.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.369 "is_configured": false, 00:13:40.369 "data_offset": 0, 00:13:40.369 "data_size": 0 00:13:40.369 }, 00:13:40.369 { 
00:13:40.369 "name": "BaseBdev2", 00:13:40.369 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:40.369 "is_configured": true, 00:13:40.369 "data_offset": 2048, 00:13:40.369 "data_size": 63488 00:13:40.369 }, 00:13:40.369 { 00:13:40.369 "name": "BaseBdev3", 00:13:40.369 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:40.369 "is_configured": true, 00:13:40.369 "data_offset": 2048, 00:13:40.370 "data_size": 63488 00:13:40.370 } 00:13:40.370 ] 00:13:40.370 }' 00:13:40.370 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.370 18:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:40.935 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:40.936 [2024-07-24 18:50:25.820354] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.936 18:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.195 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.195 "name": "Existed_Raid", 00:13:41.195 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:41.195 "strip_size_kb": 0, 00:13:41.195 "state": "configuring", 00:13:41.195 "raid_level": "raid1", 00:13:41.195 "superblock": true, 00:13:41.195 "num_base_bdevs": 3, 00:13:41.195 "num_base_bdevs_discovered": 1, 00:13:41.195 "num_base_bdevs_operational": 3, 00:13:41.195 "base_bdevs_list": [ 00:13:41.195 { 00:13:41.195 "name": "BaseBdev1", 00:13:41.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.195 "is_configured": false, 00:13:41.195 "data_offset": 0, 00:13:41.195 "data_size": 0 00:13:41.195 }, 00:13:41.195 { 00:13:41.195 "name": null, 00:13:41.195 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:41.195 "is_configured": false, 00:13:41.195 "data_offset": 2048, 00:13:41.195 "data_size": 63488 00:13:41.195 }, 00:13:41.195 { 00:13:41.195 "name": "BaseBdev3", 00:13:41.195 "uuid": 
"cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:41.195 "is_configured": true, 00:13:41.195 "data_offset": 2048, 00:13:41.195 "data_size": 63488 00:13:41.195 } 00:13:41.195 ] 00:13:41.195 }' 00:13:41.195 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.195 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:41.761 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.761 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:41.761 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:41.761 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:42.020 [2024-07-24 18:50:26.805614] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:42.020 BaseBdev1 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.020 18:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:42.279 [ 00:13:42.279 { 00:13:42.279 "name": "BaseBdev1", 00:13:42.279 "aliases": [ 00:13:42.279 "86c9a055-077c-4376-b87f-c6919ace8b2f" 00:13:42.279 ], 00:13:42.279 "product_name": "Malloc disk", 00:13:42.279 "block_size": 512, 00:13:42.279 "num_blocks": 65536, 00:13:42.279 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:42.279 "assigned_rate_limits": { 00:13:42.279 "rw_ios_per_sec": 0, 00:13:42.279 "rw_mbytes_per_sec": 0, 00:13:42.279 "r_mbytes_per_sec": 0, 00:13:42.279 "w_mbytes_per_sec": 0 00:13:42.279 }, 00:13:42.279 "claimed": true, 00:13:42.279 "claim_type": "exclusive_write", 00:13:42.279 "zoned": false, 00:13:42.279 "supported_io_types": { 00:13:42.279 "read": true, 00:13:42.279 "write": true, 00:13:42.279 "unmap": true, 00:13:42.279 "flush": true, 00:13:42.279 "reset": true, 00:13:42.279 "nvme_admin": false, 00:13:42.279 "nvme_io": false, 00:13:42.279 "nvme_io_md": false, 00:13:42.279 "write_zeroes": true, 00:13:42.279 "zcopy": true, 00:13:42.279 "get_zone_info": false, 00:13:42.279 "zone_management": false, 00:13:42.279 "zone_append": false, 00:13:42.279 "compare": false, 00:13:42.279 "compare_and_write": false, 00:13:42.279 "abort": true, 00:13:42.279 "seek_hole": false, 
00:13:42.279 "seek_data": false, 00:13:42.279 "copy": true, 00:13:42.279 "nvme_iov_md": false 00:13:42.279 }, 00:13:42.279 "memory_domains": [ 00:13:42.279 { 00:13:42.279 "dma_device_id": "system", 00:13:42.279 "dma_device_type": 1 00:13:42.279 }, 00:13:42.279 { 00:13:42.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.279 "dma_device_type": 2 00:13:42.279 } 00:13:42.279 ], 00:13:42.279 "driver_specific": {} 00:13:42.279 } 00:13:42.279 ] 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.279 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.537 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.537 "name": "Existed_Raid", 00:13:42.537 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:42.537 "strip_size_kb": 0, 00:13:42.537 "state": "configuring", 00:13:42.537 "raid_level": "raid1", 00:13:42.537 "superblock": true, 00:13:42.537 "num_base_bdevs": 3, 00:13:42.537 "num_base_bdevs_discovered": 2, 00:13:42.537 "num_base_bdevs_operational": 3, 00:13:42.537 "base_bdevs_list": [ 00:13:42.537 { 00:13:42.537 "name": "BaseBdev1", 00:13:42.537 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:42.537 "is_configured": true, 00:13:42.537 "data_offset": 2048, 00:13:42.537 "data_size": 63488 00:13:42.537 }, 00:13:42.537 { 00:13:42.537 "name": null, 00:13:42.537 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:42.537 "is_configured": false, 00:13:42.537 "data_offset": 2048, 00:13:42.537 "data_size": 63488 00:13:42.537 }, 00:13:42.537 { 00:13:42.537 "name": "BaseBdev3", 00:13:42.537 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:42.537 "is_configured": true, 00:13:42.537 "data_offset": 2048, 00:13:42.537 "data_size": 63488 00:13:42.537 } 00:13:42.537 ] 00:13:42.537 }' 00:13:42.537 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.537 18:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.104 18:50:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.104 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:43.104 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:43.104 18:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:43.362 [2024-07-24 18:50:28.133082] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.362 "name": "Existed_Raid", 00:13:43.362 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:43.362 "strip_size_kb": 0, 00:13:43.362 "state": "configuring", 00:13:43.362 "raid_level": "raid1", 00:13:43.362 "superblock": true, 00:13:43.362 "num_base_bdevs": 3, 00:13:43.362 "num_base_bdevs_discovered": 1, 00:13:43.362 "num_base_bdevs_operational": 3, 00:13:43.362 "base_bdevs_list": [ 00:13:43.362 { 00:13:43.362 "name": "BaseBdev1", 00:13:43.362 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:43.362 "is_configured": true, 00:13:43.362 "data_offset": 2048, 00:13:43.362 "data_size": 63488 00:13:43.362 }, 00:13:43.362 { 00:13:43.362 "name": null, 00:13:43.362 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:43.362 "is_configured": false, 00:13:43.362 "data_offset": 2048, 00:13:43.362 "data_size": 63488 00:13:43.362 }, 00:13:43.362 { 00:13:43.362 "name": null, 00:13:43.362 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:43.362 "is_configured": false, 00:13:43.362 "data_offset": 2048, 00:13:43.362 "data_size": 63488 00:13:43.362 } 00:13:43.362 ] 00:13:43.362 }' 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.362 18:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.928 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.928 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:44.186 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:44.186 18:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:44.186 [2024-07-24 18:50:29.123679] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.186 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.444 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.444 "name": "Existed_Raid", 00:13:44.444 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:44.444 "strip_size_kb": 0, 00:13:44.444 "state": "configuring", 00:13:44.444 "raid_level": "raid1", 00:13:44.444 "superblock": true, 00:13:44.444 "num_base_bdevs": 3, 00:13:44.444 "num_base_bdevs_discovered": 2, 00:13:44.444 "num_base_bdevs_operational": 3, 00:13:44.444 "base_bdevs_list": [ 00:13:44.444 { 00:13:44.444 "name": "BaseBdev1", 00:13:44.444 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:44.444 "is_configured": true, 00:13:44.444 "data_offset": 2048, 00:13:44.444 "data_size": 63488 00:13:44.444 }, 00:13:44.444 { 00:13:44.444 "name": null, 00:13:44.444 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:44.444 "is_configured": false, 00:13:44.444 "data_offset": 2048, 00:13:44.444 "data_size": 63488 00:13:44.444 }, 00:13:44.444 { 00:13:44.444 "name": "BaseBdev3", 00:13:44.444 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:44.444 
"is_configured": true, 00:13:44.444 "data_offset": 2048, 00:13:44.444 "data_size": 63488 00:13:44.444 } 00:13:44.444 ] 00:13:44.444 }' 00:13:44.444 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.444 18:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:45.016 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.016 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:45.016 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:45.016 18:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:45.274 [2024-07-24 18:50:30.090228] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.274 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.532 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.532 "name": "Existed_Raid", 00:13:45.532 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:45.532 "strip_size_kb": 0, 00:13:45.532 "state": "configuring", 00:13:45.532 "raid_level": "raid1", 00:13:45.532 "superblock": true, 00:13:45.532 "num_base_bdevs": 3, 00:13:45.532 "num_base_bdevs_discovered": 1, 00:13:45.532 "num_base_bdevs_operational": 3, 00:13:45.532 "base_bdevs_list": [ 00:13:45.532 { 00:13:45.532 "name": null, 00:13:45.532 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:45.532 "is_configured": false, 00:13:45.532 "data_offset": 2048, 00:13:45.532 "data_size": 63488 00:13:45.532 }, 00:13:45.532 { 00:13:45.532 "name": null, 00:13:45.532 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:45.532 "is_configured": false, 00:13:45.532 "data_offset": 2048, 00:13:45.532 
"data_size": 63488 00:13:45.532 }, 00:13:45.532 { 00:13:45.532 "name": "BaseBdev3", 00:13:45.532 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:45.532 "is_configured": true, 00:13:45.532 "data_offset": 2048, 00:13:45.532 "data_size": 63488 00:13:45.532 } 00:13:45.532 ] 00:13:45.532 }' 00:13:45.532 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.532 18:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:45.790 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:45.790 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.049 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:46.049 18:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:46.307 [2024-07-24 18:50:31.098414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.307 "name": "Existed_Raid", 00:13:46.307 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:46.307 "strip_size_kb": 0, 00:13:46.307 "state": "configuring", 00:13:46.307 "raid_level": "raid1", 00:13:46.307 "superblock": true, 00:13:46.307 "num_base_bdevs": 3, 00:13:46.307 "num_base_bdevs_discovered": 2, 00:13:46.307 "num_base_bdevs_operational": 3, 00:13:46.307 "base_bdevs_list": [ 00:13:46.307 { 00:13:46.307 "name": null, 00:13:46.307 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:46.307 "is_configured": false, 00:13:46.307 "data_offset": 2048, 00:13:46.307 "data_size": 63488 
00:13:46.307 }, 00:13:46.307 { 00:13:46.307 "name": "BaseBdev2", 00:13:46.307 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:46.307 "is_configured": true, 00:13:46.307 "data_offset": 2048, 00:13:46.307 "data_size": 63488 00:13:46.307 }, 00:13:46.307 { 00:13:46.307 "name": "BaseBdev3", 00:13:46.307 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:46.307 "is_configured": true, 00:13:46.307 "data_offset": 2048, 00:13:46.307 "data_size": 63488 00:13:46.307 } 00:13:46.307 ] 00:13:46.307 }' 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.307 18:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.892 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.892 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:47.151 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:47.151 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.151 18:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:47.151 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 86c9a055-077c-4376-b87f-c6919ace8b2f 00:13:47.409 [2024-07-24 18:50:32.268153] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:47.409 [2024-07-24 18:50:32.268264] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x250d580 00:13:47.409 [2024-07-24 18:50:32.268272] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:47.409 [2024-07-24 18:50:32.268383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26bedb0 00:13:47.409 [2024-07-24 18:50:32.268477] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250d580 00:13:47.409 [2024-07-24 18:50:32.268485] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250d580 00:13:47.409 [2024-07-24 18:50:32.268557] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.409 NewBaseBdev 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.409 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:47.667 [ 00:13:47.667 { 00:13:47.667 "name": "NewBaseBdev", 00:13:47.667 "aliases": [ 00:13:47.667 "86c9a055-077c-4376-b87f-c6919ace8b2f" 00:13:47.667 ], 00:13:47.667 "product_name": "Malloc disk", 00:13:47.667 "block_size": 512, 00:13:47.667 "num_blocks": 65536, 00:13:47.667 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:47.667 "assigned_rate_limits": { 00:13:47.667 "rw_ios_per_sec": 0, 00:13:47.667 "rw_mbytes_per_sec": 0, 00:13:47.667 "r_mbytes_per_sec": 0, 00:13:47.667 "w_mbytes_per_sec": 0 00:13:47.667 }, 00:13:47.667 "claimed": true, 00:13:47.667 "claim_type": "exclusive_write", 00:13:47.667 "zoned": false, 00:13:47.667 "supported_io_types": { 00:13:47.667 "read": true, 00:13:47.667 "write": true, 00:13:47.667 "unmap": true, 00:13:47.667 "flush": true, 00:13:47.667 "reset": true, 00:13:47.667 "nvme_admin": false, 00:13:47.667 "nvme_io": false, 00:13:47.667 "nvme_io_md": false, 00:13:47.667 "write_zeroes": true, 00:13:47.667 "zcopy": true, 00:13:47.667 "get_zone_info": false, 00:13:47.667 "zone_management": false, 00:13:47.667 "zone_append": false, 00:13:47.667 "compare": false, 00:13:47.667 "compare_and_write": false, 00:13:47.667 "abort": true, 00:13:47.667 "seek_hole": false, 00:13:47.667 "seek_data": false, 00:13:47.667 "copy": true, 00:13:47.667 "nvme_iov_md": false 00:13:47.667 }, 00:13:47.667 "memory_domains": [ 00:13:47.667 { 00:13:47.667 "dma_device_id": "system", 00:13:47.667 "dma_device_type": 1 00:13:47.667 }, 00:13:47.667 { 00:13:47.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.667 "dma_device_type": 2 00:13:47.667 } 00:13:47.667 ], 00:13:47.667 "driver_specific": {} 00:13:47.667 } 00:13:47.667 ] 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.667 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.925 
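[Editor's note] The step above is the pivotal one in this test: the replacement malloc bdev is created under the original base bdev's uuid, which lets the superblock-enabled raid1 array recognise it and transition from "configuring" to "online". A hedged sketch of that sequence with the same rpc.py calls seen in the log (in a fresh run the uuid would be read back from the raid bdev first, as shown, rather than hard-coded):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Read the uuid the array still remembers for slot 0, then recreate the bdev under it.
    uuid=$($RPC bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
    $RPC bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"

    # With all three base bdevs present again, the raid1 volume should report "online".
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
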
18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.925 "name": "Existed_Raid", 00:13:47.925 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:47.925 "strip_size_kb": 0, 00:13:47.925 "state": "online", 00:13:47.925 "raid_level": "raid1", 00:13:47.925 "superblock": true, 00:13:47.925 "num_base_bdevs": 3, 00:13:47.925 "num_base_bdevs_discovered": 3, 00:13:47.925 "num_base_bdevs_operational": 3, 00:13:47.925 "base_bdevs_list": [ 00:13:47.925 { 00:13:47.925 "name": "NewBaseBdev", 00:13:47.925 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:47.925 "is_configured": true, 00:13:47.925 "data_offset": 2048, 00:13:47.925 "data_size": 63488 00:13:47.925 }, 00:13:47.925 { 00:13:47.925 "name": "BaseBdev2", 00:13:47.925 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:47.925 "is_configured": true, 00:13:47.925 "data_offset": 2048, 00:13:47.925 "data_size": 63488 00:13:47.925 }, 00:13:47.925 { 00:13:47.925 "name": "BaseBdev3", 00:13:47.925 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:47.925 "is_configured": true, 00:13:47.925 "data_offset": 2048, 00:13:47.925 "data_size": 63488 00:13:47.925 } 00:13:47.925 ] 00:13:47.925 }' 00:13:47.925 18:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.925 18:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:48.493 [2024-07-24 18:50:33.395256] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:48.493 "name": "Existed_Raid", 00:13:48.493 "aliases": [ 00:13:48.493 "5493521b-5ef4-4a72-b2e5-a62f25298a77" 00:13:48.493 ], 00:13:48.493 "product_name": "Raid Volume", 00:13:48.493 "block_size": 512, 00:13:48.493 "num_blocks": 63488, 00:13:48.493 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:48.493 "assigned_rate_limits": { 00:13:48.493 "rw_ios_per_sec": 0, 00:13:48.493 "rw_mbytes_per_sec": 0, 00:13:48.493 "r_mbytes_per_sec": 0, 00:13:48.493 "w_mbytes_per_sec": 0 00:13:48.493 }, 00:13:48.493 "claimed": false, 00:13:48.493 "zoned": false, 00:13:48.493 "supported_io_types": { 00:13:48.493 "read": true, 00:13:48.493 "write": true, 00:13:48.493 "unmap": false, 00:13:48.493 "flush": false, 00:13:48.493 "reset": true, 00:13:48.493 "nvme_admin": false, 00:13:48.493 "nvme_io": false, 00:13:48.493 "nvme_io_md": false, 00:13:48.493 "write_zeroes": true, 00:13:48.493 "zcopy": false, 00:13:48.493 
"get_zone_info": false, 00:13:48.493 "zone_management": false, 00:13:48.493 "zone_append": false, 00:13:48.493 "compare": false, 00:13:48.493 "compare_and_write": false, 00:13:48.493 "abort": false, 00:13:48.493 "seek_hole": false, 00:13:48.493 "seek_data": false, 00:13:48.493 "copy": false, 00:13:48.493 "nvme_iov_md": false 00:13:48.493 }, 00:13:48.493 "memory_domains": [ 00:13:48.493 { 00:13:48.493 "dma_device_id": "system", 00:13:48.493 "dma_device_type": 1 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.493 "dma_device_type": 2 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "dma_device_id": "system", 00:13:48.493 "dma_device_type": 1 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.493 "dma_device_type": 2 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "dma_device_id": "system", 00:13:48.493 "dma_device_type": 1 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.493 "dma_device_type": 2 00:13:48.493 } 00:13:48.493 ], 00:13:48.493 "driver_specific": { 00:13:48.493 "raid": { 00:13:48.493 "uuid": "5493521b-5ef4-4a72-b2e5-a62f25298a77", 00:13:48.493 "strip_size_kb": 0, 00:13:48.493 "state": "online", 00:13:48.493 "raid_level": "raid1", 00:13:48.493 "superblock": true, 00:13:48.493 "num_base_bdevs": 3, 00:13:48.493 "num_base_bdevs_discovered": 3, 00:13:48.493 "num_base_bdevs_operational": 3, 00:13:48.493 "base_bdevs_list": [ 00:13:48.493 { 00:13:48.493 "name": "NewBaseBdev", 00:13:48.493 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:48.493 "is_configured": true, 00:13:48.493 "data_offset": 2048, 00:13:48.493 "data_size": 63488 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "name": "BaseBdev2", 00:13:48.493 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:48.493 "is_configured": true, 00:13:48.493 "data_offset": 2048, 00:13:48.493 "data_size": 63488 00:13:48.493 }, 00:13:48.493 { 00:13:48.493 "name": "BaseBdev3", 00:13:48.493 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:48.493 "is_configured": true, 00:13:48.493 "data_offset": 2048, 00:13:48.493 "data_size": 63488 00:13:48.493 } 00:13:48.493 ] 00:13:48.493 } 00:13:48.493 } 00:13:48.493 }' 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:48.493 BaseBdev2 00:13:48.493 BaseBdev3' 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.493 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:48.752 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.752 "name": "NewBaseBdev", 00:13:48.752 "aliases": [ 00:13:48.752 "86c9a055-077c-4376-b87f-c6919ace8b2f" 00:13:48.752 ], 00:13:48.752 "product_name": "Malloc disk", 00:13:48.752 "block_size": 512, 00:13:48.752 "num_blocks": 65536, 00:13:48.752 "uuid": "86c9a055-077c-4376-b87f-c6919ace8b2f", 00:13:48.752 "assigned_rate_limits": { 00:13:48.752 "rw_ios_per_sec": 0, 00:13:48.752 "rw_mbytes_per_sec": 0, 00:13:48.752 "r_mbytes_per_sec": 0, 
00:13:48.752 "w_mbytes_per_sec": 0 00:13:48.752 }, 00:13:48.752 "claimed": true, 00:13:48.752 "claim_type": "exclusive_write", 00:13:48.752 "zoned": false, 00:13:48.752 "supported_io_types": { 00:13:48.752 "read": true, 00:13:48.752 "write": true, 00:13:48.752 "unmap": true, 00:13:48.752 "flush": true, 00:13:48.752 "reset": true, 00:13:48.752 "nvme_admin": false, 00:13:48.752 "nvme_io": false, 00:13:48.752 "nvme_io_md": false, 00:13:48.752 "write_zeroes": true, 00:13:48.752 "zcopy": true, 00:13:48.752 "get_zone_info": false, 00:13:48.752 "zone_management": false, 00:13:48.752 "zone_append": false, 00:13:48.752 "compare": false, 00:13:48.752 "compare_and_write": false, 00:13:48.752 "abort": true, 00:13:48.752 "seek_hole": false, 00:13:48.752 "seek_data": false, 00:13:48.752 "copy": true, 00:13:48.752 "nvme_iov_md": false 00:13:48.752 }, 00:13:48.752 "memory_domains": [ 00:13:48.752 { 00:13:48.752 "dma_device_id": "system", 00:13:48.752 "dma_device_type": 1 00:13:48.752 }, 00:13:48.752 { 00:13:48.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.752 "dma_device_type": 2 00:13:48.752 } 00:13:48.752 ], 00:13:48.752 "driver_specific": {} 00:13:48.752 }' 00:13:48.752 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.752 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.752 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.752 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:49.010 18:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.269 "name": "BaseBdev2", 00:13:49.269 "aliases": [ 00:13:49.269 "ec88b282-91f3-4f3b-8f23-3968be56b030" 00:13:49.269 ], 00:13:49.269 "product_name": "Malloc disk", 00:13:49.269 "block_size": 512, 00:13:49.269 "num_blocks": 65536, 00:13:49.269 "uuid": "ec88b282-91f3-4f3b-8f23-3968be56b030", 00:13:49.269 "assigned_rate_limits": { 00:13:49.269 "rw_ios_per_sec": 0, 00:13:49.269 "rw_mbytes_per_sec": 0, 00:13:49.269 "r_mbytes_per_sec": 0, 00:13:49.269 "w_mbytes_per_sec": 0 00:13:49.269 }, 00:13:49.269 "claimed": true, 00:13:49.269 "claim_type": "exclusive_write", 
00:13:49.269 "zoned": false, 00:13:49.269 "supported_io_types": { 00:13:49.269 "read": true, 00:13:49.269 "write": true, 00:13:49.269 "unmap": true, 00:13:49.269 "flush": true, 00:13:49.269 "reset": true, 00:13:49.269 "nvme_admin": false, 00:13:49.269 "nvme_io": false, 00:13:49.269 "nvme_io_md": false, 00:13:49.269 "write_zeroes": true, 00:13:49.269 "zcopy": true, 00:13:49.269 "get_zone_info": false, 00:13:49.269 "zone_management": false, 00:13:49.269 "zone_append": false, 00:13:49.269 "compare": false, 00:13:49.269 "compare_and_write": false, 00:13:49.269 "abort": true, 00:13:49.269 "seek_hole": false, 00:13:49.269 "seek_data": false, 00:13:49.269 "copy": true, 00:13:49.269 "nvme_iov_md": false 00:13:49.269 }, 00:13:49.269 "memory_domains": [ 00:13:49.269 { 00:13:49.269 "dma_device_id": "system", 00:13:49.269 "dma_device_type": 1 00:13:49.269 }, 00:13:49.269 { 00:13:49.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.269 "dma_device_type": 2 00:13:49.269 } 00:13:49.269 ], 00:13:49.269 "driver_specific": {} 00:13:49.269 }' 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.269 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:49.527 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.786 "name": "BaseBdev3", 00:13:49.786 "aliases": [ 00:13:49.786 "cf281e99-13a0-44f5-8354-9324e89db88c" 00:13:49.786 ], 00:13:49.786 "product_name": "Malloc disk", 00:13:49.786 "block_size": 512, 00:13:49.786 "num_blocks": 65536, 00:13:49.786 "uuid": "cf281e99-13a0-44f5-8354-9324e89db88c", 00:13:49.786 "assigned_rate_limits": { 00:13:49.786 "rw_ios_per_sec": 0, 00:13:49.786 "rw_mbytes_per_sec": 0, 00:13:49.786 "r_mbytes_per_sec": 0, 00:13:49.786 "w_mbytes_per_sec": 0 00:13:49.786 }, 00:13:49.786 "claimed": true, 00:13:49.786 "claim_type": "exclusive_write", 00:13:49.786 "zoned": false, 00:13:49.786 "supported_io_types": { 00:13:49.786 "read": true, 00:13:49.786 "write": true, 00:13:49.786 
"unmap": true, 00:13:49.786 "flush": true, 00:13:49.786 "reset": true, 00:13:49.786 "nvme_admin": false, 00:13:49.786 "nvme_io": false, 00:13:49.786 "nvme_io_md": false, 00:13:49.786 "write_zeroes": true, 00:13:49.786 "zcopy": true, 00:13:49.786 "get_zone_info": false, 00:13:49.786 "zone_management": false, 00:13:49.786 "zone_append": false, 00:13:49.786 "compare": false, 00:13:49.786 "compare_and_write": false, 00:13:49.786 "abort": true, 00:13:49.786 "seek_hole": false, 00:13:49.786 "seek_data": false, 00:13:49.786 "copy": true, 00:13:49.786 "nvme_iov_md": false 00:13:49.786 }, 00:13:49.786 "memory_domains": [ 00:13:49.786 { 00:13:49.786 "dma_device_id": "system", 00:13:49.786 "dma_device_type": 1 00:13:49.786 }, 00:13:49.786 { 00:13:49.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.786 "dma_device_type": 2 00:13:49.786 } 00:13:49.786 ], 00:13:49.786 "driver_specific": {} 00:13:49.786 }' 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.786 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.045 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.045 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.045 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.045 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.045 18:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:50.045 [2024-07-24 18:50:35.047337] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:50.045 [2024-07-24 18:50:35.047358] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.045 [2024-07-24 18:50:35.047397] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.045 [2024-07-24 18:50:35.047578] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:50.045 [2024-07-24 18:50:35.047585] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250d580 name Existed_Raid, state offline 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2096427 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2096427 ']' 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2096427 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:50.304 18:50:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2096427 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2096427' 00:13:50.304 killing process with pid 2096427 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2096427 00:13:50.304 [2024-07-24 18:50:35.104516] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2096427 00:13:50.304 [2024-07-24 18:50:35.127592] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:50.304 00:13:50.304 real 0m21.207s 00:13:50.304 user 0m39.557s 00:13:50.304 sys 0m3.268s 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:50.304 18:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.304 ************************************ 00:13:50.304 END TEST raid_state_function_test_sb 00:13:50.304 ************************************ 00:13:50.562 18:50:35 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:13:50.562 18:50:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:50.562 18:50:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:50.562 18:50:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:50.562 ************************************ 00:13:50.562 START TEST raid_superblock_test 00:13:50.562 ************************************ 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local 
raid_bdev_uuid 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2100525 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2100525 /var/tmp/spdk-raid.sock 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2100525 ']' 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:50.562 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:50.562 18:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.562 [2024-07-24 18:50:35.423372] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:13:50.562 [2024-07-24 18:50:35.423412] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2100525 ] 00:13:50.562 [2024-07-24 18:50:35.487676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:50.562 [2024-07-24 18:50:35.566564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:50.820 [2024-07-24 18:50:35.624681] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:50.820 [2024-07-24 18:50:35.624703] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # 
base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:51.387 malloc1 00:13:51.387 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:51.646 [2024-07-24 18:50:36.544746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:51.646 [2024-07-24 18:50:36.544783] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:51.646 [2024-07-24 18:50:36.544795] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x88ce20 00:13:51.646 [2024-07-24 18:50:36.544802] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:51.646 [2024-07-24 18:50:36.545954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:51.646 [2024-07-24 18:50:36.545975] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:51.646 pt1 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:51.646 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:51.904 malloc2 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:51.904 [2024-07-24 18:50:36.881316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:51.904 [2024-07-24 18:50:36.881347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:51.904 [2024-07-24 18:50:36.881356] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa36ed0 00:13:51.904 [2024-07-24 18:50:36.881362] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:51.904 [2024-07-24 18:50:36.882393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:51.904 [2024-07-24 18:50:36.882413] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:51.904 pt2 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:51.904 18:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:52.163 malloc3 00:13:52.163 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:52.422 [2024-07-24 18:50:37.217662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:52.422 [2024-07-24 18:50:37.217693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:52.422 [2024-07-24 18:50:37.217702] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3aa30 00:13:52.422 [2024-07-24 18:50:37.217708] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:52.422 [2024-07-24 18:50:37.218765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:52.422 [2024-07-24 18:50:37.218784] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:52.422 pt3 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:52.422 [2024-07-24 18:50:37.386110] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:52.422 [2024-07-24 18:50:37.386983] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:52.422 [2024-07-24 18:50:37.387021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:52.422 [2024-07-24 18:50:37.387120] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa3ba40 00:13:52.422 [2024-07-24 18:50:37.387126] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:52.422 [2024-07-24 18:50:37.387250] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa36050 00:13:52.422 [2024-07-24 18:50:37.387348] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa3ba40 00:13:52.422 [2024-07-24 18:50:37.387353] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa3ba40 00:13:52.422 [2024-07-24 18:50:37.387413] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- 
# verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.422 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:52.734 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.734 "name": "raid_bdev1", 00:13:52.734 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:52.734 "strip_size_kb": 0, 00:13:52.734 "state": "online", 00:13:52.734 "raid_level": "raid1", 00:13:52.734 "superblock": true, 00:13:52.734 "num_base_bdevs": 3, 00:13:52.734 "num_base_bdevs_discovered": 3, 00:13:52.734 "num_base_bdevs_operational": 3, 00:13:52.734 "base_bdevs_list": [ 00:13:52.734 { 00:13:52.734 "name": "pt1", 00:13:52.734 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:52.734 "is_configured": true, 00:13:52.734 "data_offset": 2048, 00:13:52.734 "data_size": 63488 00:13:52.734 }, 00:13:52.734 { 00:13:52.734 "name": "pt2", 00:13:52.734 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:52.734 "is_configured": true, 00:13:52.734 "data_offset": 2048, 00:13:52.734 "data_size": 63488 00:13:52.734 }, 00:13:52.734 { 00:13:52.734 "name": "pt3", 00:13:52.734 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:52.734 "is_configured": true, 00:13:52.734 "data_offset": 2048, 00:13:52.734 "data_size": 63488 00:13:52.734 } 00:13:52.734 ] 00:13:52.734 }' 00:13:52.734 18:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.734 18:50:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:53.302 [2024-07-24 18:50:38.204372] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:53.302 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:53.302 "name": "raid_bdev1", 00:13:53.302 "aliases": [ 00:13:53.302 "5acb312c-3f96-4b8a-a04a-778dc7a27318" 00:13:53.302 ], 00:13:53.302 "product_name": "Raid Volume", 00:13:53.302 "block_size": 512, 00:13:53.302 "num_blocks": 63488, 00:13:53.302 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:53.302 "assigned_rate_limits": { 00:13:53.302 "rw_ios_per_sec": 0, 00:13:53.302 "rw_mbytes_per_sec": 0, 00:13:53.302 "r_mbytes_per_sec": 0, 00:13:53.302 "w_mbytes_per_sec": 0 00:13:53.302 }, 00:13:53.302 "claimed": false, 00:13:53.302 "zoned": false, 00:13:53.302 "supported_io_types": { 00:13:53.302 "read": true, 00:13:53.302 "write": true, 00:13:53.302 "unmap": false, 00:13:53.302 "flush": false, 00:13:53.302 "reset": true, 00:13:53.302 "nvme_admin": false, 00:13:53.302 "nvme_io": false, 00:13:53.302 "nvme_io_md": false, 00:13:53.302 "write_zeroes": true, 00:13:53.302 "zcopy": false, 00:13:53.302 "get_zone_info": false, 00:13:53.302 "zone_management": false, 00:13:53.302 "zone_append": false, 00:13:53.302 "compare": false, 00:13:53.302 "compare_and_write": false, 00:13:53.302 "abort": false, 00:13:53.302 "seek_hole": false, 00:13:53.302 "seek_data": false, 00:13:53.302 "copy": false, 00:13:53.302 "nvme_iov_md": false 00:13:53.302 }, 00:13:53.302 "memory_domains": [ 00:13:53.302 { 00:13:53.302 "dma_device_id": "system", 00:13:53.302 "dma_device_type": 1 00:13:53.302 }, 00:13:53.302 { 00:13:53.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.302 "dma_device_type": 2 00:13:53.302 }, 00:13:53.302 { 00:13:53.302 "dma_device_id": "system", 00:13:53.302 "dma_device_type": 1 00:13:53.302 }, 00:13:53.302 { 00:13:53.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.302 "dma_device_type": 2 00:13:53.302 }, 00:13:53.302 { 00:13:53.302 "dma_device_id": "system", 00:13:53.302 "dma_device_type": 1 00:13:53.302 }, 00:13:53.302 { 00:13:53.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.302 "dma_device_type": 2 00:13:53.303 } 00:13:53.303 ], 00:13:53.303 "driver_specific": { 00:13:53.303 "raid": { 00:13:53.303 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:53.303 "strip_size_kb": 0, 00:13:53.303 "state": "online", 00:13:53.303 "raid_level": "raid1", 00:13:53.303 "superblock": true, 00:13:53.303 "num_base_bdevs": 3, 00:13:53.303 "num_base_bdevs_discovered": 3, 00:13:53.303 "num_base_bdevs_operational": 3, 00:13:53.303 "base_bdevs_list": [ 00:13:53.303 { 00:13:53.303 "name": "pt1", 00:13:53.303 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:53.303 "is_configured": true, 00:13:53.303 "data_offset": 2048, 00:13:53.303 "data_size": 63488 00:13:53.303 }, 00:13:53.303 { 00:13:53.303 "name": "pt2", 00:13:53.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:53.303 "is_configured": true, 00:13:53.303 "data_offset": 2048, 00:13:53.303 "data_size": 63488 00:13:53.303 }, 00:13:53.303 { 00:13:53.303 "name": "pt3", 00:13:53.303 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:53.303 "is_configured": true, 00:13:53.303 "data_offset": 2048, 00:13:53.303 "data_size": 63488 00:13:53.303 } 00:13:53.303 ] 00:13:53.303 } 00:13:53.303 } 00:13:53.303 }' 00:13:53.303 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:53.303 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:53.303 pt2 00:13:53.303 pt3' 00:13:53.303 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.303 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:53.303 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:53.562 "name": "pt1", 00:13:53.562 "aliases": [ 00:13:53.562 "00000000-0000-0000-0000-000000000001" 00:13:53.562 ], 00:13:53.562 "product_name": "passthru", 00:13:53.562 "block_size": 512, 00:13:53.562 "num_blocks": 65536, 00:13:53.562 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:53.562 "assigned_rate_limits": { 00:13:53.562 "rw_ios_per_sec": 0, 00:13:53.562 "rw_mbytes_per_sec": 0, 00:13:53.562 "r_mbytes_per_sec": 0, 00:13:53.562 "w_mbytes_per_sec": 0 00:13:53.562 }, 00:13:53.562 "claimed": true, 00:13:53.562 "claim_type": "exclusive_write", 00:13:53.562 "zoned": false, 00:13:53.562 "supported_io_types": { 00:13:53.562 "read": true, 00:13:53.562 "write": true, 00:13:53.562 "unmap": true, 00:13:53.562 "flush": true, 00:13:53.562 "reset": true, 00:13:53.562 "nvme_admin": false, 00:13:53.562 "nvme_io": false, 00:13:53.562 "nvme_io_md": false, 00:13:53.562 "write_zeroes": true, 00:13:53.562 "zcopy": true, 00:13:53.562 "get_zone_info": false, 00:13:53.562 "zone_management": false, 00:13:53.562 "zone_append": false, 00:13:53.562 "compare": false, 00:13:53.562 "compare_and_write": false, 00:13:53.562 "abort": true, 00:13:53.562 "seek_hole": false, 00:13:53.562 "seek_data": false, 00:13:53.562 "copy": true, 00:13:53.562 "nvme_iov_md": false 00:13:53.562 }, 00:13:53.562 "memory_domains": [ 00:13:53.562 { 00:13:53.562 "dma_device_id": "system", 00:13:53.562 "dma_device_type": 1 00:13:53.562 }, 00:13:53.562 { 00:13:53.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.562 "dma_device_type": 2 00:13:53.562 } 00:13:53.562 ], 00:13:53.562 "driver_specific": { 00:13:53.562 "passthru": { 00:13:53.562 "name": "pt1", 00:13:53.562 "base_bdev_name": "malloc1" 00:13:53.562 } 00:13:53.562 } 00:13:53.562 }' 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.562 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.821 
18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:53.821 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.080 "name": "pt2", 00:13:54.080 "aliases": [ 00:13:54.080 "00000000-0000-0000-0000-000000000002" 00:13:54.080 ], 00:13:54.080 "product_name": "passthru", 00:13:54.080 "block_size": 512, 00:13:54.080 "num_blocks": 65536, 00:13:54.080 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:54.080 "assigned_rate_limits": { 00:13:54.080 "rw_ios_per_sec": 0, 00:13:54.080 "rw_mbytes_per_sec": 0, 00:13:54.080 "r_mbytes_per_sec": 0, 00:13:54.080 "w_mbytes_per_sec": 0 00:13:54.080 }, 00:13:54.080 "claimed": true, 00:13:54.080 "claim_type": "exclusive_write", 00:13:54.080 "zoned": false, 00:13:54.080 "supported_io_types": { 00:13:54.080 "read": true, 00:13:54.080 "write": true, 00:13:54.080 "unmap": true, 00:13:54.080 "flush": true, 00:13:54.080 "reset": true, 00:13:54.080 "nvme_admin": false, 00:13:54.080 "nvme_io": false, 00:13:54.080 "nvme_io_md": false, 00:13:54.080 "write_zeroes": true, 00:13:54.080 "zcopy": true, 00:13:54.080 "get_zone_info": false, 00:13:54.080 "zone_management": false, 00:13:54.080 "zone_append": false, 00:13:54.080 "compare": false, 00:13:54.080 "compare_and_write": false, 00:13:54.080 "abort": true, 00:13:54.080 "seek_hole": false, 00:13:54.080 "seek_data": false, 00:13:54.080 "copy": true, 00:13:54.080 "nvme_iov_md": false 00:13:54.080 }, 00:13:54.080 "memory_domains": [ 00:13:54.080 { 00:13:54.080 "dma_device_id": "system", 00:13:54.080 "dma_device_type": 1 00:13:54.080 }, 00:13:54.080 { 00:13:54.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.080 "dma_device_type": 2 00:13:54.080 } 00:13:54.080 ], 00:13:54.080 "driver_specific": { 00:13:54.080 "passthru": { 00:13:54.080 "name": "pt2", 00:13:54.080 "base_bdev_name": "malloc2" 00:13:54.080 } 00:13:54.080 } 00:13:54.080 }' 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.080 18:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.080 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.080 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.080 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.338 18:50:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.338 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.597 "name": "pt3", 00:13:54.597 "aliases": [ 00:13:54.597 "00000000-0000-0000-0000-000000000003" 00:13:54.597 ], 00:13:54.597 "product_name": "passthru", 00:13:54.597 "block_size": 512, 00:13:54.597 "num_blocks": 65536, 00:13:54.597 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:54.597 "assigned_rate_limits": { 00:13:54.597 "rw_ios_per_sec": 0, 00:13:54.597 "rw_mbytes_per_sec": 0, 00:13:54.597 "r_mbytes_per_sec": 0, 00:13:54.597 "w_mbytes_per_sec": 0 00:13:54.597 }, 00:13:54.597 "claimed": true, 00:13:54.597 "claim_type": "exclusive_write", 00:13:54.597 "zoned": false, 00:13:54.597 "supported_io_types": { 00:13:54.597 "read": true, 00:13:54.597 "write": true, 00:13:54.597 "unmap": true, 00:13:54.597 "flush": true, 00:13:54.597 "reset": true, 00:13:54.597 "nvme_admin": false, 00:13:54.597 "nvme_io": false, 00:13:54.597 "nvme_io_md": false, 00:13:54.597 "write_zeroes": true, 00:13:54.597 "zcopy": true, 00:13:54.597 "get_zone_info": false, 00:13:54.597 "zone_management": false, 00:13:54.597 "zone_append": false, 00:13:54.597 "compare": false, 00:13:54.597 "compare_and_write": false, 00:13:54.597 "abort": true, 00:13:54.597 "seek_hole": false, 00:13:54.597 "seek_data": false, 00:13:54.597 "copy": true, 00:13:54.597 "nvme_iov_md": false 00:13:54.597 }, 00:13:54.597 "memory_domains": [ 00:13:54.597 { 00:13:54.597 "dma_device_id": "system", 00:13:54.597 "dma_device_type": 1 00:13:54.597 }, 00:13:54.597 { 00:13:54.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.597 "dma_device_type": 2 00:13:54.597 } 00:13:54.597 ], 00:13:54.597 "driver_specific": { 00:13:54.597 "passthru": { 00:13:54.597 "name": "pt3", 00:13:54.597 "base_bdev_name": "malloc3" 00:13:54.597 } 00:13:54.597 } 00:13:54.597 }' 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.597 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:54.856 [2024-07-24 18:50:39.808520] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5acb312c-3f96-4b8a-a04a-778dc7a27318 00:13:54.856 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 5acb312c-3f96-4b8a-a04a-778dc7a27318 ']' 00:13:54.857 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:55.115 [2024-07-24 18:50:39.976779] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:55.115 [2024-07-24 18:50:39.976791] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:55.115 [2024-07-24 18:50:39.976824] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:55.115 [2024-07-24 18:50:39.976866] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:55.115 [2024-07-24 18:50:39.976871] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa3ba40 name raid_bdev1, state offline 00:13:55.115 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.115 18:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:55.374 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:55.633 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:55.633 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:55.892 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:55.892 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:55.892 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:55.892 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:55.892 18:50:40 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:55.893 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:56.152 [2024-07-24 18:50:40.979348] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:56.152 [2024-07-24 18:50:40.980329] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:56.152 [2024-07-24 18:50:40.980360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:56.152 [2024-07-24 18:50:40.980392] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:56.152 [2024-07-24 18:50:40.980424] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:56.152 [2024-07-24 18:50:40.980436] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:56.152 [2024-07-24 18:50:40.980446] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:56.152 [2024-07-24 18:50:40.980451] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa37100 name raid_bdev1, state configuring 00:13:56.152 request: 00:13:56.152 { 00:13:56.152 "name": "raid_bdev1", 00:13:56.152 "raid_level": "raid1", 00:13:56.152 "base_bdevs": [ 00:13:56.152 "malloc1", 00:13:56.152 "malloc2", 00:13:56.152 "malloc3" 00:13:56.152 ], 00:13:56.152 "superblock": false, 00:13:56.152 "method": "bdev_raid_create", 00:13:56.152 "req_id": 1 00:13:56.152 } 00:13:56.152 Got JSON-RPC error response 00:13:56.152 response: 00:13:56.152 { 00:13:56.152 "code": -17, 00:13:56.152 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:56.152 } 00:13:56.152 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:56.152 
18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:56.152 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:56.152 18:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:56.152 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.152 18:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:56.411 [2024-07-24 18:50:41.320221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:56.411 [2024-07-24 18:50:41.320250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:56.411 [2024-07-24 18:50:41.320259] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa37570 00:13:56.411 [2024-07-24 18:50:41.320265] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:56.411 [2024-07-24 18:50:41.321455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:56.411 [2024-07-24 18:50:41.321482] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:56.411 [2024-07-24 18:50:41.321528] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:56.411 [2024-07-24 18:50:41.321547] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:56.411 pt1 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.411 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:56.669 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:13:56.669 "name": "raid_bdev1", 00:13:56.669 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:56.669 "strip_size_kb": 0, 00:13:56.669 "state": "configuring", 00:13:56.669 "raid_level": "raid1", 00:13:56.669 "superblock": true, 00:13:56.669 "num_base_bdevs": 3, 00:13:56.669 "num_base_bdevs_discovered": 1, 00:13:56.669 "num_base_bdevs_operational": 3, 00:13:56.669 "base_bdevs_list": [ 00:13:56.669 { 00:13:56.669 "name": "pt1", 00:13:56.669 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:56.669 "is_configured": true, 00:13:56.669 "data_offset": 2048, 00:13:56.669 "data_size": 63488 00:13:56.669 }, 00:13:56.669 { 00:13:56.669 "name": null, 00:13:56.669 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:56.669 "is_configured": false, 00:13:56.669 "data_offset": 2048, 00:13:56.669 "data_size": 63488 00:13:56.669 }, 00:13:56.669 { 00:13:56.669 "name": null, 00:13:56.669 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:56.669 "is_configured": false, 00:13:56.669 "data_offset": 2048, 00:13:56.669 "data_size": 63488 00:13:56.669 } 00:13:56.669 ] 00:13:56.669 }' 00:13:56.669 18:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.669 18:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.236 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:57.236 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:57.236 [2024-07-24 18:50:42.158398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:57.236 [2024-07-24 18:50:42.158439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.236 [2024-07-24 18:50:42.158450] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa382e0 00:13:57.236 [2024-07-24 18:50:42.158456] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.236 [2024-07-24 18:50:42.158729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.236 [2024-07-24 18:50:42.158739] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:57.236 [2024-07-24 18:50:42.158784] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:57.236 [2024-07-24 18:50:42.158796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:57.236 pt2 00:13:57.236 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:57.494 [2024-07-24 18:50:42.338872] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.494 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:57.752 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.752 "name": "raid_bdev1", 00:13:57.752 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:57.752 "strip_size_kb": 0, 00:13:57.752 "state": "configuring", 00:13:57.752 "raid_level": "raid1", 00:13:57.752 "superblock": true, 00:13:57.752 "num_base_bdevs": 3, 00:13:57.752 "num_base_bdevs_discovered": 1, 00:13:57.752 "num_base_bdevs_operational": 3, 00:13:57.752 "base_bdevs_list": [ 00:13:57.752 { 00:13:57.752 "name": "pt1", 00:13:57.752 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:57.752 "is_configured": true, 00:13:57.752 "data_offset": 2048, 00:13:57.752 "data_size": 63488 00:13:57.752 }, 00:13:57.752 { 00:13:57.752 "name": null, 00:13:57.752 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:57.752 "is_configured": false, 00:13:57.752 "data_offset": 2048, 00:13:57.752 "data_size": 63488 00:13:57.752 }, 00:13:57.752 { 00:13:57.752 "name": null, 00:13:57.752 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:57.752 "is_configured": false, 00:13:57.752 "data_offset": 2048, 00:13:57.752 "data_size": 63488 00:13:57.752 } 00:13:57.752 ] 00:13:57.752 }' 00:13:57.752 18:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.752 18:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:58.320 [2024-07-24 18:50:43.185049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:58.320 [2024-07-24 18:50:43.185084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:58.320 [2024-07-24 18:50:43.185096] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x88da40 00:13:58.320 [2024-07-24 18:50:43.185102] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:58.320 [2024-07-24 18:50:43.185343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:58.320 [2024-07-24 18:50:43.185353] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:58.320 [2024-07-24 18:50:43.185393] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:58.320 [2024-07-24 18:50:43.185404] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:58.320 pt2 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:58.320 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:58.578 [2024-07-24 18:50:43.353484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:58.578 [2024-07-24 18:50:43.353502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:58.578 [2024-07-24 18:50:43.353510] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3c400 00:13:58.578 [2024-07-24 18:50:43.353515] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:58.578 [2024-07-24 18:50:43.353691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:58.578 [2024-07-24 18:50:43.353699] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:58.578 [2024-07-24 18:50:43.353727] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:58.578 [2024-07-24 18:50:43.353737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:58.578 [2024-07-24 18:50:43.353803] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa38e00 00:13:58.578 [2024-07-24 18:50:43.353808] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:58.578 [2024-07-24 18:50:43.353918] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa3b0e0 00:13:58.578 [2024-07-24 18:50:43.353999] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa38e00 00:13:58.578 [2024-07-24 18:50:43.354004] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa38e00 00:13:58.578 [2024-07-24 18:50:43.354062] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.578 pt3 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.578 "name": "raid_bdev1", 00:13:58.578 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:58.578 "strip_size_kb": 0, 00:13:58.578 "state": "online", 00:13:58.578 "raid_level": "raid1", 00:13:58.578 "superblock": true, 00:13:58.578 "num_base_bdevs": 3, 00:13:58.578 "num_base_bdevs_discovered": 3, 00:13:58.578 "num_base_bdevs_operational": 3, 00:13:58.578 "base_bdevs_list": [ 00:13:58.578 { 00:13:58.578 "name": "pt1", 00:13:58.578 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:58.578 "is_configured": true, 00:13:58.578 "data_offset": 2048, 00:13:58.578 "data_size": 63488 00:13:58.578 }, 00:13:58.578 { 00:13:58.578 "name": "pt2", 00:13:58.578 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:58.578 "is_configured": true, 00:13:58.578 "data_offset": 2048, 00:13:58.578 "data_size": 63488 00:13:58.578 }, 00:13:58.578 { 00:13:58.578 "name": "pt3", 00:13:58.578 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:58.578 "is_configured": true, 00:13:58.578 "data_offset": 2048, 00:13:58.578 "data_size": 63488 00:13:58.578 } 00:13:58.578 ] 00:13:58.578 }' 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.578 18:50:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.143 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:59.144 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:59.402 [2024-07-24 18:50:44.191848] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:59.402 "name": "raid_bdev1", 00:13:59.402 "aliases": [ 00:13:59.402 "5acb312c-3f96-4b8a-a04a-778dc7a27318" 00:13:59.402 ], 00:13:59.402 "product_name": "Raid Volume", 00:13:59.402 "block_size": 512, 00:13:59.402 "num_blocks": 63488, 00:13:59.402 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:59.402 "assigned_rate_limits": { 00:13:59.402 "rw_ios_per_sec": 0, 00:13:59.402 "rw_mbytes_per_sec": 0, 00:13:59.402 "r_mbytes_per_sec": 0, 00:13:59.402 "w_mbytes_per_sec": 0 00:13:59.402 }, 00:13:59.402 "claimed": false, 00:13:59.402 "zoned": false, 00:13:59.402 "supported_io_types": { 00:13:59.402 "read": true, 00:13:59.402 "write": true, 
00:13:59.402 "unmap": false, 00:13:59.402 "flush": false, 00:13:59.402 "reset": true, 00:13:59.402 "nvme_admin": false, 00:13:59.402 "nvme_io": false, 00:13:59.402 "nvme_io_md": false, 00:13:59.402 "write_zeroes": true, 00:13:59.402 "zcopy": false, 00:13:59.402 "get_zone_info": false, 00:13:59.402 "zone_management": false, 00:13:59.402 "zone_append": false, 00:13:59.402 "compare": false, 00:13:59.402 "compare_and_write": false, 00:13:59.402 "abort": false, 00:13:59.402 "seek_hole": false, 00:13:59.402 "seek_data": false, 00:13:59.402 "copy": false, 00:13:59.402 "nvme_iov_md": false 00:13:59.402 }, 00:13:59.402 "memory_domains": [ 00:13:59.402 { 00:13:59.402 "dma_device_id": "system", 00:13:59.402 "dma_device_type": 1 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.402 "dma_device_type": 2 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "dma_device_id": "system", 00:13:59.402 "dma_device_type": 1 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.402 "dma_device_type": 2 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "dma_device_id": "system", 00:13:59.402 "dma_device_type": 1 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.402 "dma_device_type": 2 00:13:59.402 } 00:13:59.402 ], 00:13:59.402 "driver_specific": { 00:13:59.402 "raid": { 00:13:59.402 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:13:59.402 "strip_size_kb": 0, 00:13:59.402 "state": "online", 00:13:59.402 "raid_level": "raid1", 00:13:59.402 "superblock": true, 00:13:59.402 "num_base_bdevs": 3, 00:13:59.402 "num_base_bdevs_discovered": 3, 00:13:59.402 "num_base_bdevs_operational": 3, 00:13:59.402 "base_bdevs_list": [ 00:13:59.402 { 00:13:59.402 "name": "pt1", 00:13:59.402 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:59.402 "is_configured": true, 00:13:59.402 "data_offset": 2048, 00:13:59.402 "data_size": 63488 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "name": "pt2", 00:13:59.402 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:59.402 "is_configured": true, 00:13:59.402 "data_offset": 2048, 00:13:59.402 "data_size": 63488 00:13:59.402 }, 00:13:59.402 { 00:13:59.402 "name": "pt3", 00:13:59.402 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:59.402 "is_configured": true, 00:13:59.402 "data_offset": 2048, 00:13:59.402 "data_size": 63488 00:13:59.402 } 00:13:59.402 ] 00:13:59.402 } 00:13:59.402 } 00:13:59.402 }' 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:59.402 pt2 00:13:59.402 pt3' 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:59.402 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:59.660 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:59.660 "name": "pt1", 00:13:59.660 "aliases": [ 00:13:59.660 "00000000-0000-0000-0000-000000000001" 00:13:59.660 ], 00:13:59.660 "product_name": "passthru", 00:13:59.660 "block_size": 512, 00:13:59.660 "num_blocks": 65536, 00:13:59.660 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:13:59.660 "assigned_rate_limits": { 00:13:59.660 "rw_ios_per_sec": 0, 00:13:59.660 "rw_mbytes_per_sec": 0, 00:13:59.660 "r_mbytes_per_sec": 0, 00:13:59.660 "w_mbytes_per_sec": 0 00:13:59.660 }, 00:13:59.660 "claimed": true, 00:13:59.660 "claim_type": "exclusive_write", 00:13:59.660 "zoned": false, 00:13:59.660 "supported_io_types": { 00:13:59.660 "read": true, 00:13:59.660 "write": true, 00:13:59.660 "unmap": true, 00:13:59.660 "flush": true, 00:13:59.660 "reset": true, 00:13:59.660 "nvme_admin": false, 00:13:59.660 "nvme_io": false, 00:13:59.660 "nvme_io_md": false, 00:13:59.660 "write_zeroes": true, 00:13:59.660 "zcopy": true, 00:13:59.660 "get_zone_info": false, 00:13:59.660 "zone_management": false, 00:13:59.660 "zone_append": false, 00:13:59.660 "compare": false, 00:13:59.660 "compare_and_write": false, 00:13:59.660 "abort": true, 00:13:59.660 "seek_hole": false, 00:13:59.660 "seek_data": false, 00:13:59.660 "copy": true, 00:13:59.660 "nvme_iov_md": false 00:13:59.661 }, 00:13:59.661 "memory_domains": [ 00:13:59.661 { 00:13:59.661 "dma_device_id": "system", 00:13:59.661 "dma_device_type": 1 00:13:59.661 }, 00:13:59.661 { 00:13:59.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.661 "dma_device_type": 2 00:13:59.661 } 00:13:59.661 ], 00:13:59.661 "driver_specific": { 00:13:59.661 "passthru": { 00:13:59.661 "name": "pt1", 00:13:59.661 "base_bdev_name": "malloc1" 00:13:59.661 } 00:13:59.661 } 00:13:59.661 }' 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.661 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:59.919 "name": "pt2", 00:13:59.919 "aliases": [ 00:13:59.919 "00000000-0000-0000-0000-000000000002" 00:13:59.919 ], 00:13:59.919 "product_name": "passthru", 00:13:59.919 "block_size": 512, 00:13:59.919 "num_blocks": 65536, 00:13:59.919 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:59.919 "assigned_rate_limits": { 00:13:59.919 "rw_ios_per_sec": 0, 00:13:59.919 
"rw_mbytes_per_sec": 0, 00:13:59.919 "r_mbytes_per_sec": 0, 00:13:59.919 "w_mbytes_per_sec": 0 00:13:59.919 }, 00:13:59.919 "claimed": true, 00:13:59.919 "claim_type": "exclusive_write", 00:13:59.919 "zoned": false, 00:13:59.919 "supported_io_types": { 00:13:59.919 "read": true, 00:13:59.919 "write": true, 00:13:59.919 "unmap": true, 00:13:59.919 "flush": true, 00:13:59.919 "reset": true, 00:13:59.919 "nvme_admin": false, 00:13:59.919 "nvme_io": false, 00:13:59.919 "nvme_io_md": false, 00:13:59.919 "write_zeroes": true, 00:13:59.919 "zcopy": true, 00:13:59.919 "get_zone_info": false, 00:13:59.919 "zone_management": false, 00:13:59.919 "zone_append": false, 00:13:59.919 "compare": false, 00:13:59.919 "compare_and_write": false, 00:13:59.919 "abort": true, 00:13:59.919 "seek_hole": false, 00:13:59.919 "seek_data": false, 00:13:59.919 "copy": true, 00:13:59.919 "nvme_iov_md": false 00:13:59.919 }, 00:13:59.919 "memory_domains": [ 00:13:59.919 { 00:13:59.919 "dma_device_id": "system", 00:13:59.919 "dma_device_type": 1 00:13:59.919 }, 00:13:59.919 { 00:13:59.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.919 "dma_device_type": 2 00:13:59.919 } 00:13:59.919 ], 00:13:59.919 "driver_specific": { 00:13:59.919 "passthru": { 00:13:59.919 "name": "pt2", 00:13:59.919 "base_bdev_name": "malloc2" 00:13:59.919 } 00:13:59.919 } 00:13:59.919 }' 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.919 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.178 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.178 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.178 18:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:00.178 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.437 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.437 "name": "pt3", 00:14:00.437 "aliases": [ 00:14:00.437 "00000000-0000-0000-0000-000000000003" 00:14:00.437 ], 00:14:00.437 "product_name": "passthru", 00:14:00.437 "block_size": 512, 00:14:00.437 "num_blocks": 65536, 00:14:00.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:00.437 "assigned_rate_limits": { 00:14:00.437 "rw_ios_per_sec": 0, 00:14:00.437 "rw_mbytes_per_sec": 0, 00:14:00.437 "r_mbytes_per_sec": 0, 00:14:00.437 "w_mbytes_per_sec": 0 00:14:00.437 }, 00:14:00.437 "claimed": 
true, 00:14:00.437 "claim_type": "exclusive_write", 00:14:00.437 "zoned": false, 00:14:00.437 "supported_io_types": { 00:14:00.437 "read": true, 00:14:00.437 "write": true, 00:14:00.437 "unmap": true, 00:14:00.437 "flush": true, 00:14:00.437 "reset": true, 00:14:00.437 "nvme_admin": false, 00:14:00.437 "nvme_io": false, 00:14:00.437 "nvme_io_md": false, 00:14:00.437 "write_zeroes": true, 00:14:00.437 "zcopy": true, 00:14:00.437 "get_zone_info": false, 00:14:00.437 "zone_management": false, 00:14:00.437 "zone_append": false, 00:14:00.437 "compare": false, 00:14:00.437 "compare_and_write": false, 00:14:00.437 "abort": true, 00:14:00.437 "seek_hole": false, 00:14:00.437 "seek_data": false, 00:14:00.437 "copy": true, 00:14:00.437 "nvme_iov_md": false 00:14:00.437 }, 00:14:00.437 "memory_domains": [ 00:14:00.437 { 00:14:00.437 "dma_device_id": "system", 00:14:00.437 "dma_device_type": 1 00:14:00.437 }, 00:14:00.437 { 00:14:00.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.437 "dma_device_type": 2 00:14:00.437 } 00:14:00.437 ], 00:14:00.437 "driver_specific": { 00:14:00.437 "passthru": { 00:14:00.437 "name": "pt3", 00:14:00.437 "base_bdev_name": "malloc3" 00:14:00.437 } 00:14:00.437 } 00:14:00.437 }' 00:14:00.437 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.437 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.437 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.437 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:00.696 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:00.974 [2024-07-24 18:50:45.792011] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 5acb312c-3f96-4b8a-a04a-778dc7a27318 '!=' 5acb312c-3f96-4b8a-a04a-778dc7a27318 ']' 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:00.974 [2024-07-24 18:50:45.960270] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.974 18:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:01.232 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.232 "name": "raid_bdev1", 00:14:01.232 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:14:01.232 "strip_size_kb": 0, 00:14:01.232 "state": "online", 00:14:01.232 "raid_level": "raid1", 00:14:01.232 "superblock": true, 00:14:01.232 "num_base_bdevs": 3, 00:14:01.232 "num_base_bdevs_discovered": 2, 00:14:01.232 "num_base_bdevs_operational": 2, 00:14:01.232 "base_bdevs_list": [ 00:14:01.232 { 00:14:01.232 "name": null, 00:14:01.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.232 "is_configured": false, 00:14:01.232 "data_offset": 2048, 00:14:01.232 "data_size": 63488 00:14:01.232 }, 00:14:01.232 { 00:14:01.232 "name": "pt2", 00:14:01.232 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:01.232 "is_configured": true, 00:14:01.232 "data_offset": 2048, 00:14:01.232 "data_size": 63488 00:14:01.232 }, 00:14:01.232 { 00:14:01.232 "name": "pt3", 00:14:01.232 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:01.232 "is_configured": true, 00:14:01.232 "data_offset": 2048, 00:14:01.232 "data_size": 63488 00:14:01.232 } 00:14:01.232 ] 00:14:01.232 }' 00:14:01.232 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.232 18:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.799 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:01.799 [2024-07-24 18:50:46.798434] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:01.799 [2024-07-24 18:50:46.798455] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:01.799 [2024-07-24 18:50:46.798500] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:01.799 [2024-07-24 18:50:46.798539] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs 
is 0, going to free all in destruct 00:14:01.799 [2024-07-24 18:50:46.798545] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa38e00 name raid_bdev1, state offline 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:02.058 18:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:02.316 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:02.316 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:02.317 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:02.576 [2024-07-24 18:50:47.488191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:02.576 [2024-07-24 18:50:47.488221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:02.576 [2024-07-24 18:50:47.488231] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3c630 00:14:02.576 [2024-07-24 18:50:47.488237] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:02.576 [2024-07-24 18:50:47.489432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:02.576 [2024-07-24 18:50:47.489450] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:02.576 [2024-07-24 18:50:47.489499] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:02.576 [2024-07-24 18:50:47.489516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:02.576 pt2 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=raid1 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:02.576 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.835 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.835 "name": "raid_bdev1", 00:14:02.835 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:14:02.835 "strip_size_kb": 0, 00:14:02.835 "state": "configuring", 00:14:02.835 "raid_level": "raid1", 00:14:02.835 "superblock": true, 00:14:02.835 "num_base_bdevs": 3, 00:14:02.835 "num_base_bdevs_discovered": 1, 00:14:02.835 "num_base_bdevs_operational": 2, 00:14:02.835 "base_bdevs_list": [ 00:14:02.835 { 00:14:02.836 "name": null, 00:14:02.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.836 "is_configured": false, 00:14:02.836 "data_offset": 2048, 00:14:02.836 "data_size": 63488 00:14:02.836 }, 00:14:02.836 { 00:14:02.836 "name": "pt2", 00:14:02.836 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:02.836 "is_configured": true, 00:14:02.836 "data_offset": 2048, 00:14:02.836 "data_size": 63488 00:14:02.836 }, 00:14:02.836 { 00:14:02.836 "name": null, 00:14:02.836 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:02.836 "is_configured": false, 00:14:02.836 "data_offset": 2048, 00:14:02.836 "data_size": 63488 00:14:02.836 } 00:14:02.836 ] 00:14:02.836 }' 00:14:02.836 18:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.836 18:50:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.402 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:14:03.402 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:03.402 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:14:03.402 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:03.402 [2024-07-24 18:50:48.318339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:03.402 [2024-07-24 18:50:48.318367] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:03.402 [2024-07-24 18:50:48.318377] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3e500 00:14:03.402 [2024-07-24 18:50:48.318382] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:03.402 [2024-07-24 18:50:48.318633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:03.403 [2024-07-24 18:50:48.318643] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:03.403 [2024-07-24 18:50:48.318680] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:03.403 [2024-07-24 18:50:48.318696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:03.403 [2024-07-24 18:50:48.318760] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa39510 00:14:03.403 [2024-07-24 18:50:48.318765] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:03.403 [2024-07-24 18:50:48.318876] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa388a0 00:14:03.403 [2024-07-24 18:50:48.318959] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa39510 00:14:03.403 [2024-07-24 18:50:48.318964] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa39510 00:14:03.403 [2024-07-24 18:50:48.319026] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:03.403 pt3 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.403 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:03.661 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.661 "name": "raid_bdev1", 00:14:03.661 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:14:03.661 "strip_size_kb": 0, 00:14:03.661 "state": "online", 00:14:03.661 "raid_level": "raid1", 00:14:03.661 "superblock": true, 00:14:03.661 "num_base_bdevs": 3, 00:14:03.661 "num_base_bdevs_discovered": 2, 00:14:03.661 "num_base_bdevs_operational": 2, 00:14:03.661 "base_bdevs_list": [ 00:14:03.661 { 00:14:03.661 "name": null, 00:14:03.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.661 "is_configured": false, 00:14:03.661 "data_offset": 2048, 00:14:03.661 "data_size": 63488 00:14:03.661 }, 00:14:03.661 { 00:14:03.661 "name": "pt2", 00:14:03.661 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:03.661 "is_configured": true, 00:14:03.661 "data_offset": 2048, 00:14:03.661 "data_size": 63488 00:14:03.661 }, 00:14:03.661 { 00:14:03.661 "name": "pt3", 00:14:03.661 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:03.661 "is_configured": true, 00:14:03.661 "data_offset": 2048, 00:14:03.661 "data_size": 63488 00:14:03.661 } 00:14:03.661 ] 00:14:03.661 }' 00:14:03.661 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.661 18:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.229 18:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:04.229 [2024-07-24 18:50:49.148474] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:04.229 [2024-07-24 18:50:49.148494] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:04.229 [2024-07-24 18:50:49.148535] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:04.229 [2024-07-24 18:50:49.148573] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:04.229 [2024-07-24 18:50:49.148579] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa39510 name raid_bdev1, state offline 00:14:04.229 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.229 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:14:04.489 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:14:04.489 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:14:04.489 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:14:04.489 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:14:04.489 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:04.748 [2024-07-24 18:50:49.661778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:04.748 [2024-07-24 18:50:49.661809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:04.748 [2024-07-24 18:50:49.661818] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3e0c0 00:14:04.748 [2024-07-24 18:50:49.661823] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:04.748 [2024-07-24 18:50:49.663009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:04.748 [2024-07-24 18:50:49.663030] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:04.748 [2024-07-24 18:50:49.663075] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:04.748 [2024-07-24 18:50:49.663092] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:04.748 [2024-07-24 18:50:49.663158] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:04.748 [2024-07-24 
18:50:49.663164] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:04.748 [2024-07-24 18:50:49.663172] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88c3c0 name raid_bdev1, state configuring 00:14:04.748 [2024-07-24 18:50:49.663186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:04.748 pt1 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.748 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:05.007 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.007 "name": "raid_bdev1", 00:14:05.007 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:14:05.007 "strip_size_kb": 0, 00:14:05.007 "state": "configuring", 00:14:05.007 "raid_level": "raid1", 00:14:05.007 "superblock": true, 00:14:05.007 "num_base_bdevs": 3, 00:14:05.007 "num_base_bdevs_discovered": 1, 00:14:05.007 "num_base_bdevs_operational": 2, 00:14:05.007 "base_bdevs_list": [ 00:14:05.007 { 00:14:05.007 "name": null, 00:14:05.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.007 "is_configured": false, 00:14:05.007 "data_offset": 2048, 00:14:05.007 "data_size": 63488 00:14:05.007 }, 00:14:05.007 { 00:14:05.007 "name": "pt2", 00:14:05.007 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:05.007 "is_configured": true, 00:14:05.007 "data_offset": 2048, 00:14:05.007 "data_size": 63488 00:14:05.007 }, 00:14:05.007 { 00:14:05.007 "name": null, 00:14:05.007 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:05.007 "is_configured": false, 00:14:05.007 "data_offset": 2048, 00:14:05.007 "data_size": 63488 00:14:05.007 } 00:14:05.007 ] 00:14:05.007 }' 00:14:05.007 18:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.007 18:50:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.573 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:14:05.573 18:50:50 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:05.573 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:14:05.573 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:05.832 [2024-07-24 18:50:50.656354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:05.832 [2024-07-24 18:50:50.656394] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:05.832 [2024-07-24 18:50:50.656406] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3e500 00:14:05.832 [2024-07-24 18:50:50.656412] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:05.832 [2024-07-24 18:50:50.656668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:05.832 [2024-07-24 18:50:50.656679] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:05.832 [2024-07-24 18:50:50.656723] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:05.832 [2024-07-24 18:50:50.656737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:05.832 [2024-07-24 18:50:50.656804] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x88d1f0 00:14:05.832 [2024-07-24 18:50:50.656810] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:05.832 [2024-07-24 18:50:50.656922] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x91e510 00:14:05.832 [2024-07-24 18:50:50.657009] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x88d1f0 00:14:05.832 [2024-07-24 18:50:50.657014] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x88d1f0 00:14:05.832 [2024-07-24 18:50:50.657077] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:05.832 pt3 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.832 "name": "raid_bdev1", 00:14:05.832 "uuid": "5acb312c-3f96-4b8a-a04a-778dc7a27318", 00:14:05.832 "strip_size_kb": 0, 00:14:05.832 "state": "online", 00:14:05.832 "raid_level": "raid1", 00:14:05.832 "superblock": true, 00:14:05.832 "num_base_bdevs": 3, 00:14:05.832 "num_base_bdevs_discovered": 2, 00:14:05.832 "num_base_bdevs_operational": 2, 00:14:05.832 "base_bdevs_list": [ 00:14:05.832 { 00:14:05.832 "name": null, 00:14:05.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.832 "is_configured": false, 00:14:05.832 "data_offset": 2048, 00:14:05.832 "data_size": 63488 00:14:05.832 }, 00:14:05.832 { 00:14:05.832 "name": "pt2", 00:14:05.832 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:05.832 "is_configured": true, 00:14:05.832 "data_offset": 2048, 00:14:05.832 "data_size": 63488 00:14:05.832 }, 00:14:05.832 { 00:14:05.832 "name": "pt3", 00:14:05.832 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:05.832 "is_configured": true, 00:14:05.832 "data_offset": 2048, 00:14:05.832 "data_size": 63488 00:14:05.832 } 00:14:05.832 ] 00:14:05.832 }' 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.832 18:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.399 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:06.399 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:14:06.657 [2024-07-24 18:50:51.590925] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 5acb312c-3f96-4b8a-a04a-778dc7a27318 '!=' 5acb312c-3f96-4b8a-a04a-778dc7a27318 ']' 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2100525 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2100525 ']' 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2100525 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2100525 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2100525' 00:14:06.657 killing process with pid 2100525 00:14:06.657 18:50:51 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2100525 00:14:06.657 [2024-07-24 18:50:51.649828] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:06.657 [2024-07-24 18:50:51.649868] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:06.657 [2024-07-24 18:50:51.649906] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:06.657 [2024-07-24 18:50:51.649911] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88d1f0 name raid_bdev1, state offline 00:14:06.657 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2100525 00:14:06.915 [2024-07-24 18:50:51.673012] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:06.915 18:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:06.915 00:14:06.915 real 0m16.472s 00:14:06.915 user 0m30.514s 00:14:06.915 sys 0m2.564s 00:14:06.915 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:06.916 18:50:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.916 ************************************ 00:14:06.916 END TEST raid_superblock_test 00:14:06.916 ************************************ 00:14:06.916 18:50:51 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:14:06.916 18:50:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:06.916 18:50:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:06.916 18:50:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:06.916 ************************************ 00:14:06.916 START TEST raid_read_error_test 00:14:06.916 ************************************ 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:06.916 
18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:06.916 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.D4s04G1Aug 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2103711 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2103711 /var/tmp/spdk-raid.sock 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2103711 ']' 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:07.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:07.174 18:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.174 [2024-07-24 18:50:51.978036] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
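For reference, the bdevperf setup that the trace above records reduces to roughly the following sketch. Paths are relative to the spdk checkout used in this workspace, the socket name and options are the ones this run used, waitforlisten is the harness helper from autotest_common.sh that blocks until the RPC socket answers, and the trailing perform_tests call is only issued later in the log once the raid stack exists.

    # Launch bdevperf as the RPC target for the raid error-injection test
    ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
    raid_pid=$!
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # wait for the RPC socket to come up
    # ... build the bdev stack over RPC (see the sketches further below) ...
    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests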
00:14:07.174 [2024-07-24 18:50:51.978085] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2103711 ] 00:14:07.174 [2024-07-24 18:50:52.044194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:07.174 [2024-07-24 18:50:52.123108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:07.174 [2024-07-24 18:50:52.178241] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:07.174 [2024-07-24 18:50:52.178267] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:08.109 18:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:08.109 18:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:08.109 18:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:08.109 18:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:08.109 BaseBdev1_malloc 00:14:08.109 18:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:08.109 true 00:14:08.109 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:08.368 [2024-07-24 18:50:53.257901] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:08.368 [2024-07-24 18:50:53.257934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:08.368 [2024-07-24 18:50:53.257946] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e1d20 00:14:08.368 [2024-07-24 18:50:53.257952] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:08.368 [2024-07-24 18:50:53.259189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:08.368 [2024-07-24 18:50:53.259211] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:08.368 BaseBdev1 00:14:08.368 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:08.368 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:08.625 BaseBdev2_malloc 00:14:08.625 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:08.625 true 00:14:08.625 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:08.882 [2024-07-24 18:50:53.766617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:08.882 [2024-07-24 18:50:53.766660] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:08.882 [2024-07-24 18:50:53.766671] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e6d50 00:14:08.882 [2024-07-24 18:50:53.766676] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:08.882 [2024-07-24 18:50:53.767753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:08.882 [2024-07-24 18:50:53.767773] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:08.882 BaseBdev2 00:14:08.882 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:08.882 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:09.141 BaseBdev3_malloc 00:14:09.141 18:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:09.141 true 00:14:09.141 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:09.399 [2024-07-24 18:50:54.259356] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:09.399 [2024-07-24 18:50:54.259387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.399 [2024-07-24 18:50:54.259398] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e5ef0 00:14:09.399 [2024-07-24 18:50:54.259403] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.399 [2024-07-24 18:50:54.260489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.399 [2024-07-24 18:50:54.260508] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:09.399 BaseBdev3 00:14:09.399 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:09.657 [2024-07-24 18:50:54.427814] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:09.657 [2024-07-24 18:50:54.428678] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:09.657 [2024-07-24 18:50:54.428723] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:09.657 [2024-07-24 18:50:54.428856] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21e9a00 00:14:09.657 [2024-07-24 18:50:54.428863] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:09.657 [2024-07-24 18:50:54.428991] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x203d750 00:14:09.657 [2024-07-24 18:50:54.429099] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21e9a00 00:14:09.657 [2024-07-24 18:50:54.429104] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21e9a00 00:14:09.657 [2024-07-24 18:50:54.429172] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:09.657 18:50:54 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.657 "name": "raid_bdev1", 00:14:09.657 "uuid": "95ae6dcc-4f87-45d3-93c1-79fd8c6fea76", 00:14:09.657 "strip_size_kb": 0, 00:14:09.657 "state": "online", 00:14:09.657 "raid_level": "raid1", 00:14:09.657 "superblock": true, 00:14:09.657 "num_base_bdevs": 3, 00:14:09.657 "num_base_bdevs_discovered": 3, 00:14:09.657 "num_base_bdevs_operational": 3, 00:14:09.657 "base_bdevs_list": [ 00:14:09.657 { 00:14:09.657 "name": "BaseBdev1", 00:14:09.657 "uuid": "0591fd35-8cdd-5edc-8670-82817ddd214a", 00:14:09.657 "is_configured": true, 00:14:09.657 "data_offset": 2048, 00:14:09.657 "data_size": 63488 00:14:09.657 }, 00:14:09.657 { 00:14:09.657 "name": "BaseBdev2", 00:14:09.657 "uuid": "35e7f385-97bf-544f-b3f2-7380256055bc", 00:14:09.657 "is_configured": true, 00:14:09.657 "data_offset": 2048, 00:14:09.657 "data_size": 63488 00:14:09.657 }, 00:14:09.657 { 00:14:09.657 "name": "BaseBdev3", 00:14:09.657 "uuid": "718e70f6-9b40-5610-aa20-ee9bcbf8b357", 00:14:09.657 "is_configured": true, 00:14:09.657 "data_offset": 2048, 00:14:09.657 "data_size": 63488 00:14:09.657 } 00:14:09.657 ] 00:14:09.657 }' 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.657 18:50:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.223 18:50:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:10.223 18:50:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:10.223 [2024-07-24 18:50:55.173957] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ef930 00:14:11.158 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.417 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:11.676 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.676 "name": "raid_bdev1", 00:14:11.676 "uuid": "95ae6dcc-4f87-45d3-93c1-79fd8c6fea76", 00:14:11.676 "strip_size_kb": 0, 00:14:11.676 "state": "online", 00:14:11.676 "raid_level": "raid1", 00:14:11.676 "superblock": true, 00:14:11.676 "num_base_bdevs": 3, 00:14:11.676 "num_base_bdevs_discovered": 3, 00:14:11.676 "num_base_bdevs_operational": 3, 00:14:11.676 "base_bdevs_list": [ 00:14:11.676 { 00:14:11.676 "name": "BaseBdev1", 00:14:11.676 "uuid": "0591fd35-8cdd-5edc-8670-82817ddd214a", 00:14:11.676 "is_configured": true, 00:14:11.676 "data_offset": 2048, 00:14:11.676 "data_size": 63488 00:14:11.676 }, 00:14:11.676 { 00:14:11.676 "name": "BaseBdev2", 00:14:11.676 "uuid": "35e7f385-97bf-544f-b3f2-7380256055bc", 00:14:11.676 "is_configured": true, 00:14:11.676 "data_offset": 2048, 00:14:11.676 "data_size": 63488 00:14:11.676 }, 00:14:11.676 { 00:14:11.676 "name": "BaseBdev3", 00:14:11.676 "uuid": "718e70f6-9b40-5610-aa20-ee9bcbf8b357", 00:14:11.676 "is_configured": true, 00:14:11.676 "data_offset": 2048, 00:14:11.676 "data_size": 63488 00:14:11.676 } 00:14:11.676 ] 00:14:11.676 }' 00:14:11.676 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.676 18:50:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.969 18:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:12.239 [2024-07-24 18:50:57.103254] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:12.239 [2024-07-24 18:50:57.103289] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:12.239 [2024-07-24 18:50:57.105336] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:12.239 [2024-07-24 18:50:57.105360] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:12.239 [2024-07-24 18:50:57.105422] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:12.239 [2024-07-24 18:50:57.105429] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e9a00 name raid_bdev1, state offline 00:14:12.239 0 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2103711 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2103711 ']' 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2103711 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2103711 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2103711' 00:14:12.239 killing process with pid 2103711 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2103711 00:14:12.239 [2024-07-24 18:50:57.163792] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:12.239 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2103711 00:14:12.239 [2024-07-24 18:50:57.181763] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.D4s04G1Aug 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:12.498 00:14:12.498 real 0m5.455s 00:14:12.498 user 0m8.491s 00:14:12.498 sys 0m0.794s 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:12.498 18:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.498 ************************************ 00:14:12.498 END TEST raid_read_error_test 00:14:12.498 ************************************ 00:14:12.498 18:50:57 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:14:12.498 18:50:57 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:12.498 18:50:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:12.498 18:50:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:12.498 ************************************ 00:14:12.498 START TEST raid_write_error_test 00:14:12.498 ************************************ 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:12.498 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.REwErq8nwd 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2104710 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2104710 /var/tmp/spdk-raid.sock 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w 
randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2104710 ']' 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:12.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:12.499 18:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.499 [2024-07-24 18:50:57.497114] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:14:12.499 [2024-07-24 18:50:57.497153] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2104710 ] 00:14:12.757 [2024-07-24 18:50:57.561852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:12.757 [2024-07-24 18:50:57.634090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:12.757 [2024-07-24 18:50:57.689150] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:12.757 [2024-07-24 18:50:57.689176] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:13.325 18:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:13.325 18:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:13.325 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:13.325 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:13.583 BaseBdev1_malloc 00:14:13.583 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:13.842 true 00:14:13.842 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:13.842 [2024-07-24 18:50:58.793335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:13.842 [2024-07-24 18:50:58.793373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.842 [2024-07-24 18:50:58.793384] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e0d20 00:14:13.842 [2024-07-24 18:50:58.793389] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.842 [2024-07-24 18:50:58.794511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.842 [2024-07-24 18:50:58.794532] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:13.842 BaseBdev1 
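Each base device in these error tests is a three-layer stack: a malloc bdev, an error-injection bdev on top of it (which is what later lets the test flip individual I/O types to failure), and a passthru bdev that the raid actually claims. A minimal sketch of that layering for the first base bdev, using only RPC calls that appear in the trace above; the socket and names are the ones from this run, and the EE_ prefix is the name the error bdev exposes:

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc     # 32 MB backing store, 512-byte blocks
    $RPC bdev_error_create BaseBdev1_malloc                # injectable wrapper, exposed as EE_BaseBdev1_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1

BaseBdev2 and BaseBdev3 are built the same way before the raid1 is assembled.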
00:14:13.842 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:13.842 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:14.101 BaseBdev2_malloc 00:14:14.101 18:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:14.359 true 00:14:14.359 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:14.359 [2024-07-24 18:50:59.326294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:14.359 [2024-07-24 18:50:59.326326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.359 [2024-07-24 18:50:59.326335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e5d50 00:14:14.359 [2024-07-24 18:50:59.326340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.359 [2024-07-24 18:50:59.327284] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.359 [2024-07-24 18:50:59.327302] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:14.359 BaseBdev2 00:14:14.359 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:14.359 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:14.618 BaseBdev3_malloc 00:14:14.618 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:14.876 true 00:14:14.876 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:14.876 [2024-07-24 18:50:59.830881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:14.876 [2024-07-24 18:50:59.830905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.876 [2024-07-24 18:50:59.830914] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e4ef0 00:14:14.876 [2024-07-24 18:50:59.830919] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.876 [2024-07-24 18:50:59.831829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.876 [2024-07-24 18:50:59.831847] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:14.876 BaseBdev3 00:14:14.876 18:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:15.135 [2024-07-24 18:50:59.999346] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
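Once the three passthru bdevs exist, the script assembles them into a raid1 with an on-disk superblock and then checks the array state over RPC; the failure cases later in the log inject errors at the bottom of one stack and re-run the same check. A sketch of that sequence, again limited to calls visible in this trace (the jq filter is the one verify_raid_bdev_state uses):

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    # Expect "state": "online" with 3 of 3 base bdevs discovered/operational
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    # For the write-error case: make every write to the first base bdev fail
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure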
00:14:15.135 [2024-07-24 18:51:00.000208] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:15.135 [2024-07-24 18:51:00.000253] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:15.135 [2024-07-24 18:51:00.000393] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e8a00 00:14:15.135 [2024-07-24 18:51:00.000399] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:15.135 [2024-07-24 18:51:00.000531] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223c750 00:14:15.135 [2024-07-24 18:51:00.000642] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e8a00 00:14:15.135 [2024-07-24 18:51:00.000649] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23e8a00 00:14:15.135 [2024-07-24 18:51:00.000717] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.135 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:15.392 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.393 "name": "raid_bdev1", 00:14:15.393 "uuid": "96297bf1-f344-439c-b74b-998da2a9a0e5", 00:14:15.393 "strip_size_kb": 0, 00:14:15.393 "state": "online", 00:14:15.393 "raid_level": "raid1", 00:14:15.393 "superblock": true, 00:14:15.393 "num_base_bdevs": 3, 00:14:15.393 "num_base_bdevs_discovered": 3, 00:14:15.393 "num_base_bdevs_operational": 3, 00:14:15.393 "base_bdevs_list": [ 00:14:15.393 { 00:14:15.393 "name": "BaseBdev1", 00:14:15.393 "uuid": "fd4a5529-860f-5e97-9379-a93f9e133593", 00:14:15.393 "is_configured": true, 00:14:15.393 "data_offset": 2048, 00:14:15.393 "data_size": 63488 00:14:15.393 }, 00:14:15.393 { 00:14:15.393 "name": "BaseBdev2", 00:14:15.393 "uuid": "7420e1b0-1966-5853-81e1-63c80bff6f8a", 00:14:15.393 "is_configured": true, 00:14:15.393 "data_offset": 2048, 00:14:15.393 "data_size": 63488 00:14:15.393 }, 00:14:15.393 { 00:14:15.393 "name": "BaseBdev3", 00:14:15.393 "uuid": "302d4af1-2546-5f37-b73e-fa0f06b75e9a", 00:14:15.393 "is_configured": true, 00:14:15.393 
"data_offset": 2048, 00:14:15.393 "data_size": 63488 00:14:15.393 } 00:14:15.393 ] 00:14:15.393 }' 00:14:15.393 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.393 18:51:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.959 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:15.959 18:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:15.959 [2024-07-24 18:51:00.777589] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ee930 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:16.891 [2024-07-24 18:51:01.869453] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:16.891 [2024-07-24 18:51:01.869509] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:16.891 [2024-07-24 18:51:01.869687] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x23ee930 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:16.891 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.892 18:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.149 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.149 "name": "raid_bdev1", 00:14:17.149 "uuid": "96297bf1-f344-439c-b74b-998da2a9a0e5", 00:14:17.149 "strip_size_kb": 0, 00:14:17.149 "state": "online", 00:14:17.149 "raid_level": "raid1", 00:14:17.149 
"superblock": true, 00:14:17.149 "num_base_bdevs": 3, 00:14:17.149 "num_base_bdevs_discovered": 2, 00:14:17.149 "num_base_bdevs_operational": 2, 00:14:17.149 "base_bdevs_list": [ 00:14:17.149 { 00:14:17.149 "name": null, 00:14:17.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.149 "is_configured": false, 00:14:17.149 "data_offset": 2048, 00:14:17.149 "data_size": 63488 00:14:17.149 }, 00:14:17.149 { 00:14:17.149 "name": "BaseBdev2", 00:14:17.149 "uuid": "7420e1b0-1966-5853-81e1-63c80bff6f8a", 00:14:17.149 "is_configured": true, 00:14:17.149 "data_offset": 2048, 00:14:17.149 "data_size": 63488 00:14:17.149 }, 00:14:17.149 { 00:14:17.149 "name": "BaseBdev3", 00:14:17.149 "uuid": "302d4af1-2546-5f37-b73e-fa0f06b75e9a", 00:14:17.149 "is_configured": true, 00:14:17.149 "data_offset": 2048, 00:14:17.149 "data_size": 63488 00:14:17.149 } 00:14:17.149 ] 00:14:17.149 }' 00:14:17.149 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.149 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:17.715 [2024-07-24 18:51:02.700106] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:17.715 [2024-07-24 18:51:02.700141] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:17.715 [2024-07-24 18:51:02.702166] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:17.715 [2024-07-24 18:51:02.702189] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:17.715 [2024-07-24 18:51:02.702237] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:17.715 [2024-07-24 18:51:02.702243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e8a00 name raid_bdev1, state offline 00:14:17.715 0 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2104710 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2104710 ']' 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2104710 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:17.715 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2104710 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2104710' 00:14:17.974 killing process with pid 2104710 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2104710 00:14:17.974 [2024-07-24 18:51:02.758508] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2104710 00:14:17.974 [2024-07-24 18:51:02.776282] bdev_raid.c:1399:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.REwErq8nwd 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:17.974 00:14:17.974 real 0m5.530s 00:14:17.974 user 0m8.577s 00:14:17.974 sys 0m0.832s 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:17.974 18:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.974 ************************************ 00:14:17.974 END TEST raid_write_error_test 00:14:17.974 ************************************ 00:14:18.233 18:51:03 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:18.233 18:51:03 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:18.233 18:51:03 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:14:18.233 18:51:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:18.233 18:51:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:18.233 18:51:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:18.233 ************************************ 00:14:18.233 START TEST raid_state_function_test 00:14:18.233 ************************************ 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:18.233 18:51:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2105902 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2105902' 00:14:18.233 Process raid pid: 2105902 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2105902 /var/tmp/spdk-raid.sock 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2105902 ']' 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:18.233 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:18.233 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.233 [2024-07-24 18:51:03.094764] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:14:18.233 [2024-07-24 18:51:03.094803] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:18.233 [2024-07-24 18:51:03.160149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:18.233 [2024-07-24 18:51:03.240549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:18.495 [2024-07-24 18:51:03.300043] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.495 [2024-07-24 18:51:03.300062] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.064 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:19.064 18:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:19.064 18:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:19.064 [2024-07-24 18:51:04.055148] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:19.064 [2024-07-24 18:51:04.055179] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:19.064 [2024-07-24 18:51:04.055184] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:19.064 [2024-07-24 18:51:04.055190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:19.064 [2024-07-24 18:51:04.055194] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:19.064 [2024-07-24 18:51:04.055199] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:19.064 [2024-07-24 18:51:04.055219] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:19.064 [2024-07-24 18:51:04.055225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.064 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.323 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.323 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.323 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.323 "name": "Existed_Raid", 00:14:19.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.323 "strip_size_kb": 64, 00:14:19.323 "state": "configuring", 00:14:19.323 "raid_level": "raid0", 00:14:19.323 "superblock": false, 00:14:19.323 "num_base_bdevs": 4, 00:14:19.323 "num_base_bdevs_discovered": 0, 00:14:19.323 "num_base_bdevs_operational": 4, 00:14:19.323 "base_bdevs_list": [ 00:14:19.323 { 00:14:19.323 "name": "BaseBdev1", 00:14:19.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.323 "is_configured": false, 00:14:19.323 "data_offset": 0, 00:14:19.323 "data_size": 0 00:14:19.323 }, 00:14:19.323 { 00:14:19.323 "name": "BaseBdev2", 00:14:19.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.323 "is_configured": false, 00:14:19.323 "data_offset": 0, 00:14:19.323 "data_size": 0 00:14:19.323 }, 00:14:19.323 { 00:14:19.323 "name": "BaseBdev3", 00:14:19.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.323 "is_configured": false, 00:14:19.323 "data_offset": 0, 00:14:19.323 "data_size": 0 00:14:19.323 }, 00:14:19.323 { 00:14:19.323 "name": "BaseBdev4", 00:14:19.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.323 "is_configured": false, 00:14:19.323 "data_offset": 0, 00:14:19.323 "data_size": 0 00:14:19.323 } 00:14:19.323 ] 00:14:19.323 }' 00:14:19.323 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.323 18:51:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.890 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:19.890 [2024-07-24 18:51:04.889235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:19.890 [2024-07-24 18:51:04.889259] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2181bc0 name Existed_Raid, state configuring 00:14:20.148 18:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:20.148 [2024-07-24 18:51:05.057697] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:20.148 [2024-07-24 18:51:05.057720] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:20.148 [2024-07-24 18:51:05.057725] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:20.148 [2024-07-24 18:51:05.057730] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:20.148 [2024-07-24 18:51:05.057734] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:20.148 [2024-07-24 18:51:05.057739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:20.148 [2024-07-24 18:51:05.057744] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:20.148 
[2024-07-24 18:51:05.057748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:20.148 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:20.407 [2024-07-24 18:51:05.238344] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:20.407 BaseBdev1 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:20.407 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:20.665 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:20.666 [ 00:14:20.666 { 00:14:20.666 "name": "BaseBdev1", 00:14:20.666 "aliases": [ 00:14:20.666 "0b937528-f5dd-4653-971e-6ec7ea13a88f" 00:14:20.666 ], 00:14:20.666 "product_name": "Malloc disk", 00:14:20.666 "block_size": 512, 00:14:20.666 "num_blocks": 65536, 00:14:20.666 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:20.666 "assigned_rate_limits": { 00:14:20.666 "rw_ios_per_sec": 0, 00:14:20.666 "rw_mbytes_per_sec": 0, 00:14:20.666 "r_mbytes_per_sec": 0, 00:14:20.666 "w_mbytes_per_sec": 0 00:14:20.666 }, 00:14:20.666 "claimed": true, 00:14:20.666 "claim_type": "exclusive_write", 00:14:20.666 "zoned": false, 00:14:20.666 "supported_io_types": { 00:14:20.666 "read": true, 00:14:20.666 "write": true, 00:14:20.666 "unmap": true, 00:14:20.666 "flush": true, 00:14:20.666 "reset": true, 00:14:20.666 "nvme_admin": false, 00:14:20.666 "nvme_io": false, 00:14:20.666 "nvme_io_md": false, 00:14:20.666 "write_zeroes": true, 00:14:20.666 "zcopy": true, 00:14:20.666 "get_zone_info": false, 00:14:20.666 "zone_management": false, 00:14:20.666 "zone_append": false, 00:14:20.666 "compare": false, 00:14:20.666 "compare_and_write": false, 00:14:20.666 "abort": true, 00:14:20.666 "seek_hole": false, 00:14:20.666 "seek_data": false, 00:14:20.666 "copy": true, 00:14:20.666 "nvme_iov_md": false 00:14:20.666 }, 00:14:20.666 "memory_domains": [ 00:14:20.666 { 00:14:20.666 "dma_device_id": "system", 00:14:20.666 "dma_device_type": 1 00:14:20.666 }, 00:14:20.666 { 00:14:20.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.666 "dma_device_type": 2 00:14:20.666 } 00:14:20.666 ], 00:14:20.666 "driver_specific": {} 00:14:20.666 } 00:14:20.666 ] 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.666 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.924 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.924 "name": "Existed_Raid", 00:14:20.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.924 "strip_size_kb": 64, 00:14:20.924 "state": "configuring", 00:14:20.924 "raid_level": "raid0", 00:14:20.924 "superblock": false, 00:14:20.924 "num_base_bdevs": 4, 00:14:20.924 "num_base_bdevs_discovered": 1, 00:14:20.924 "num_base_bdevs_operational": 4, 00:14:20.924 "base_bdevs_list": [ 00:14:20.924 { 00:14:20.924 "name": "BaseBdev1", 00:14:20.924 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:20.924 "is_configured": true, 00:14:20.925 "data_offset": 0, 00:14:20.925 "data_size": 65536 00:14:20.925 }, 00:14:20.925 { 00:14:20.925 "name": "BaseBdev2", 00:14:20.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.925 "is_configured": false, 00:14:20.925 "data_offset": 0, 00:14:20.925 "data_size": 0 00:14:20.925 }, 00:14:20.925 { 00:14:20.925 "name": "BaseBdev3", 00:14:20.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.925 "is_configured": false, 00:14:20.925 "data_offset": 0, 00:14:20.925 "data_size": 0 00:14:20.925 }, 00:14:20.925 { 00:14:20.925 "name": "BaseBdev4", 00:14:20.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.925 "is_configured": false, 00:14:20.925 "data_offset": 0, 00:14:20.925 "data_size": 0 00:14:20.925 } 00:14:20.925 ] 00:14:20.925 }' 00:14:20.925 18:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.925 18:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.491 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:21.491 [2024-07-24 18:51:06.397335] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:21.491 [2024-07-24 18:51:06.397367] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2181430 name Existed_Raid, state configuring 00:14:21.491 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:21.750 [2024-07-24 18:51:06.557785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:21.750 [2024-07-24 18:51:06.558845] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:21.750 [2024-07-24 18:51:06.558869] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:21.750 [2024-07-24 18:51:06.558875] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:21.750 [2024-07-24 18:51:06.558880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:21.750 [2024-07-24 18:51:06.558885] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:21.750 [2024-07-24 18:51:06.558890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.750 "name": "Existed_Raid", 00:14:21.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.750 "strip_size_kb": 64, 00:14:21.750 "state": "configuring", 00:14:21.750 "raid_level": "raid0", 00:14:21.750 "superblock": false, 00:14:21.750 "num_base_bdevs": 4, 00:14:21.750 "num_base_bdevs_discovered": 1, 00:14:21.750 "num_base_bdevs_operational": 4, 00:14:21.750 "base_bdevs_list": [ 00:14:21.750 { 00:14:21.750 "name": "BaseBdev1", 00:14:21.750 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:21.750 "is_configured": true, 00:14:21.750 "data_offset": 0, 00:14:21.750 "data_size": 65536 00:14:21.750 }, 00:14:21.750 { 00:14:21.750 "name": "BaseBdev2", 00:14:21.750 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:21.750 "is_configured": false, 00:14:21.750 "data_offset": 0, 00:14:21.750 "data_size": 0 00:14:21.750 }, 00:14:21.750 { 00:14:21.750 "name": "BaseBdev3", 00:14:21.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.750 "is_configured": false, 00:14:21.750 "data_offset": 0, 00:14:21.750 "data_size": 0 00:14:21.750 }, 00:14:21.750 { 00:14:21.750 "name": "BaseBdev4", 00:14:21.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.750 "is_configured": false, 00:14:21.750 "data_offset": 0, 00:14:21.750 "data_size": 0 00:14:21.750 } 00:14:21.750 ] 00:14:21.750 }' 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.750 18:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.317 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:22.575 [2024-07-24 18:51:07.418864] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:22.575 BaseBdev2 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:22.575 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:22.834 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:22.834 [ 00:14:22.834 { 00:14:22.834 "name": "BaseBdev2", 00:14:22.834 "aliases": [ 00:14:22.834 "752c33aa-569d-430a-b5c3-08dda58fd03c" 00:14:22.834 ], 00:14:22.834 "product_name": "Malloc disk", 00:14:22.834 "block_size": 512, 00:14:22.834 "num_blocks": 65536, 00:14:22.834 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:22.834 "assigned_rate_limits": { 00:14:22.834 "rw_ios_per_sec": 0, 00:14:22.834 "rw_mbytes_per_sec": 0, 00:14:22.834 "r_mbytes_per_sec": 0, 00:14:22.834 "w_mbytes_per_sec": 0 00:14:22.834 }, 00:14:22.834 "claimed": true, 00:14:22.834 "claim_type": "exclusive_write", 00:14:22.834 "zoned": false, 00:14:22.834 "supported_io_types": { 00:14:22.834 "read": true, 00:14:22.834 "write": true, 00:14:22.834 "unmap": true, 00:14:22.834 "flush": true, 00:14:22.834 "reset": true, 00:14:22.834 "nvme_admin": false, 00:14:22.834 "nvme_io": false, 00:14:22.834 "nvme_io_md": false, 00:14:22.834 "write_zeroes": true, 00:14:22.834 "zcopy": true, 00:14:22.834 "get_zone_info": false, 00:14:22.835 "zone_management": false, 00:14:22.835 "zone_append": false, 00:14:22.835 "compare": false, 00:14:22.835 "compare_and_write": false, 00:14:22.835 "abort": true, 00:14:22.835 "seek_hole": false, 00:14:22.835 "seek_data": false, 00:14:22.835 
"copy": true, 00:14:22.835 "nvme_iov_md": false 00:14:22.835 }, 00:14:22.835 "memory_domains": [ 00:14:22.835 { 00:14:22.835 "dma_device_id": "system", 00:14:22.835 "dma_device_type": 1 00:14:22.835 }, 00:14:22.835 { 00:14:22.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.835 "dma_device_type": 2 00:14:22.835 } 00:14:22.835 ], 00:14:22.835 "driver_specific": {} 00:14:22.835 } 00:14:22.835 ] 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.835 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.094 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.094 "name": "Existed_Raid", 00:14:23.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.094 "strip_size_kb": 64, 00:14:23.094 "state": "configuring", 00:14:23.094 "raid_level": "raid0", 00:14:23.094 "superblock": false, 00:14:23.094 "num_base_bdevs": 4, 00:14:23.094 "num_base_bdevs_discovered": 2, 00:14:23.094 "num_base_bdevs_operational": 4, 00:14:23.094 "base_bdevs_list": [ 00:14:23.094 { 00:14:23.094 "name": "BaseBdev1", 00:14:23.094 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:23.094 "is_configured": true, 00:14:23.094 "data_offset": 0, 00:14:23.094 "data_size": 65536 00:14:23.094 }, 00:14:23.094 { 00:14:23.094 "name": "BaseBdev2", 00:14:23.094 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:23.094 "is_configured": true, 00:14:23.094 "data_offset": 0, 00:14:23.094 "data_size": 65536 00:14:23.094 }, 00:14:23.094 { 00:14:23.094 "name": "BaseBdev3", 00:14:23.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.094 "is_configured": false, 00:14:23.094 "data_offset": 0, 00:14:23.094 "data_size": 0 00:14:23.094 }, 00:14:23.094 { 00:14:23.094 "name": "BaseBdev4", 00:14:23.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.094 "is_configured": false, 00:14:23.094 
"data_offset": 0, 00:14:23.094 "data_size": 0 00:14:23.094 } 00:14:23.094 ] 00:14:23.094 }' 00:14:23.094 18:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.094 18:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:23.662 [2024-07-24 18:51:08.620790] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:23.662 BaseBdev3 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:23.662 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:23.920 18:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:24.179 [ 00:14:24.179 { 00:14:24.179 "name": "BaseBdev3", 00:14:24.179 "aliases": [ 00:14:24.179 "6bda2bb2-e5e3-4815-a096-18445feedc73" 00:14:24.179 ], 00:14:24.179 "product_name": "Malloc disk", 00:14:24.179 "block_size": 512, 00:14:24.179 "num_blocks": 65536, 00:14:24.179 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:24.179 "assigned_rate_limits": { 00:14:24.179 "rw_ios_per_sec": 0, 00:14:24.179 "rw_mbytes_per_sec": 0, 00:14:24.179 "r_mbytes_per_sec": 0, 00:14:24.179 "w_mbytes_per_sec": 0 00:14:24.179 }, 00:14:24.179 "claimed": true, 00:14:24.179 "claim_type": "exclusive_write", 00:14:24.179 "zoned": false, 00:14:24.179 "supported_io_types": { 00:14:24.179 "read": true, 00:14:24.179 "write": true, 00:14:24.179 "unmap": true, 00:14:24.179 "flush": true, 00:14:24.179 "reset": true, 00:14:24.179 "nvme_admin": false, 00:14:24.179 "nvme_io": false, 00:14:24.179 "nvme_io_md": false, 00:14:24.179 "write_zeroes": true, 00:14:24.179 "zcopy": true, 00:14:24.179 "get_zone_info": false, 00:14:24.179 "zone_management": false, 00:14:24.179 "zone_append": false, 00:14:24.179 "compare": false, 00:14:24.179 "compare_and_write": false, 00:14:24.179 "abort": true, 00:14:24.179 "seek_hole": false, 00:14:24.179 "seek_data": false, 00:14:24.179 "copy": true, 00:14:24.179 "nvme_iov_md": false 00:14:24.179 }, 00:14:24.179 "memory_domains": [ 00:14:24.179 { 00:14:24.179 "dma_device_id": "system", 00:14:24.179 "dma_device_type": 1 00:14:24.179 }, 00:14:24.179 { 00:14:24.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.179 "dma_device_type": 2 00:14:24.179 } 00:14:24.179 ], 00:14:24.179 "driver_specific": {} 00:14:24.179 } 00:14:24.179 ] 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:24.179 18:51:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.179 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.438 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.438 "name": "Existed_Raid", 00:14:24.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.438 "strip_size_kb": 64, 00:14:24.438 "state": "configuring", 00:14:24.438 "raid_level": "raid0", 00:14:24.438 "superblock": false, 00:14:24.438 "num_base_bdevs": 4, 00:14:24.438 "num_base_bdevs_discovered": 3, 00:14:24.438 "num_base_bdevs_operational": 4, 00:14:24.438 "base_bdevs_list": [ 00:14:24.438 { 00:14:24.438 "name": "BaseBdev1", 00:14:24.438 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:24.438 "is_configured": true, 00:14:24.438 "data_offset": 0, 00:14:24.438 "data_size": 65536 00:14:24.438 }, 00:14:24.438 { 00:14:24.438 "name": "BaseBdev2", 00:14:24.438 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:24.438 "is_configured": true, 00:14:24.438 "data_offset": 0, 00:14:24.438 "data_size": 65536 00:14:24.438 }, 00:14:24.438 { 00:14:24.438 "name": "BaseBdev3", 00:14:24.438 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:24.438 "is_configured": true, 00:14:24.438 "data_offset": 0, 00:14:24.438 "data_size": 65536 00:14:24.438 }, 00:14:24.438 { 00:14:24.438 "name": "BaseBdev4", 00:14:24.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.438 "is_configured": false, 00:14:24.438 "data_offset": 0, 00:14:24.438 "data_size": 0 00:14:24.438 } 00:14:24.438 ] 00:14:24.438 }' 00:14:24.438 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.438 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.696 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:24.954 [2024-07-24 
18:51:09.858746] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:24.954 [2024-07-24 18:51:09.858772] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2182490 00:14:24.954 [2024-07-24 18:51:09.858777] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:24.954 [2024-07-24 18:51:09.858906] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x216e2d0 00:14:24.954 [2024-07-24 18:51:09.858992] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2182490 00:14:24.954 [2024-07-24 18:51:09.858997] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2182490 00:14:24.954 [2024-07-24 18:51:09.859119] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:24.954 BaseBdev4 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:24.954 18:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:25.213 18:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:25.472 [ 00:14:25.472 { 00:14:25.472 "name": "BaseBdev4", 00:14:25.472 "aliases": [ 00:14:25.472 "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd" 00:14:25.472 ], 00:14:25.472 "product_name": "Malloc disk", 00:14:25.472 "block_size": 512, 00:14:25.472 "num_blocks": 65536, 00:14:25.472 "uuid": "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd", 00:14:25.472 "assigned_rate_limits": { 00:14:25.472 "rw_ios_per_sec": 0, 00:14:25.472 "rw_mbytes_per_sec": 0, 00:14:25.472 "r_mbytes_per_sec": 0, 00:14:25.472 "w_mbytes_per_sec": 0 00:14:25.472 }, 00:14:25.472 "claimed": true, 00:14:25.472 "claim_type": "exclusive_write", 00:14:25.472 "zoned": false, 00:14:25.472 "supported_io_types": { 00:14:25.472 "read": true, 00:14:25.472 "write": true, 00:14:25.472 "unmap": true, 00:14:25.472 "flush": true, 00:14:25.472 "reset": true, 00:14:25.472 "nvme_admin": false, 00:14:25.472 "nvme_io": false, 00:14:25.472 "nvme_io_md": false, 00:14:25.472 "write_zeroes": true, 00:14:25.472 "zcopy": true, 00:14:25.472 "get_zone_info": false, 00:14:25.472 "zone_management": false, 00:14:25.472 "zone_append": false, 00:14:25.472 "compare": false, 00:14:25.472 "compare_and_write": false, 00:14:25.472 "abort": true, 00:14:25.472 "seek_hole": false, 00:14:25.472 "seek_data": false, 00:14:25.472 "copy": true, 00:14:25.472 "nvme_iov_md": false 00:14:25.472 }, 00:14:25.472 "memory_domains": [ 00:14:25.472 { 00:14:25.472 "dma_device_id": "system", 00:14:25.472 "dma_device_type": 1 00:14:25.472 }, 00:14:25.472 { 00:14:25.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.472 "dma_device_type": 2 
00:14:25.472 } 00:14:25.472 ], 00:14:25.472 "driver_specific": {} 00:14:25.472 } 00:14:25.472 ] 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.472 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.472 "name": "Existed_Raid", 00:14:25.472 "uuid": "cdf71350-fb3b-4b78-97a6-3411c069071c", 00:14:25.472 "strip_size_kb": 64, 00:14:25.472 "state": "online", 00:14:25.472 "raid_level": "raid0", 00:14:25.472 "superblock": false, 00:14:25.472 "num_base_bdevs": 4, 00:14:25.472 "num_base_bdevs_discovered": 4, 00:14:25.472 "num_base_bdevs_operational": 4, 00:14:25.472 "base_bdevs_list": [ 00:14:25.472 { 00:14:25.472 "name": "BaseBdev1", 00:14:25.472 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:25.472 "is_configured": true, 00:14:25.472 "data_offset": 0, 00:14:25.472 "data_size": 65536 00:14:25.472 }, 00:14:25.472 { 00:14:25.472 "name": "BaseBdev2", 00:14:25.472 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:25.472 "is_configured": true, 00:14:25.472 "data_offset": 0, 00:14:25.472 "data_size": 65536 00:14:25.472 }, 00:14:25.472 { 00:14:25.472 "name": "BaseBdev3", 00:14:25.472 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:25.472 "is_configured": true, 00:14:25.472 "data_offset": 0, 00:14:25.472 "data_size": 65536 00:14:25.473 }, 00:14:25.473 { 00:14:25.473 "name": "BaseBdev4", 00:14:25.473 "uuid": "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd", 00:14:25.473 "is_configured": true, 00:14:25.473 "data_offset": 0, 00:14:25.473 "data_size": 65536 00:14:25.473 } 00:14:25.473 ] 00:14:25.473 }' 00:14:25.473 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.473 18:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.042 18:51:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:26.042 18:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:26.300 [2024-07-24 18:51:11.058038] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:26.300 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:26.300 "name": "Existed_Raid", 00:14:26.300 "aliases": [ 00:14:26.300 "cdf71350-fb3b-4b78-97a6-3411c069071c" 00:14:26.300 ], 00:14:26.300 "product_name": "Raid Volume", 00:14:26.300 "block_size": 512, 00:14:26.300 "num_blocks": 262144, 00:14:26.300 "uuid": "cdf71350-fb3b-4b78-97a6-3411c069071c", 00:14:26.300 "assigned_rate_limits": { 00:14:26.300 "rw_ios_per_sec": 0, 00:14:26.300 "rw_mbytes_per_sec": 0, 00:14:26.300 "r_mbytes_per_sec": 0, 00:14:26.300 "w_mbytes_per_sec": 0 00:14:26.300 }, 00:14:26.300 "claimed": false, 00:14:26.300 "zoned": false, 00:14:26.300 "supported_io_types": { 00:14:26.300 "read": true, 00:14:26.300 "write": true, 00:14:26.300 "unmap": true, 00:14:26.300 "flush": true, 00:14:26.300 "reset": true, 00:14:26.300 "nvme_admin": false, 00:14:26.300 "nvme_io": false, 00:14:26.300 "nvme_io_md": false, 00:14:26.300 "write_zeroes": true, 00:14:26.300 "zcopy": false, 00:14:26.300 "get_zone_info": false, 00:14:26.300 "zone_management": false, 00:14:26.300 "zone_append": false, 00:14:26.300 "compare": false, 00:14:26.300 "compare_and_write": false, 00:14:26.300 "abort": false, 00:14:26.300 "seek_hole": false, 00:14:26.300 "seek_data": false, 00:14:26.300 "copy": false, 00:14:26.300 "nvme_iov_md": false 00:14:26.300 }, 00:14:26.300 "memory_domains": [ 00:14:26.300 { 00:14:26.300 "dma_device_id": "system", 00:14:26.300 "dma_device_type": 1 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.300 "dma_device_type": 2 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "system", 00:14:26.300 "dma_device_type": 1 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.300 "dma_device_type": 2 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "system", 00:14:26.300 "dma_device_type": 1 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.300 "dma_device_type": 2 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "system", 00:14:26.300 "dma_device_type": 1 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.300 "dma_device_type": 2 00:14:26.300 } 00:14:26.300 ], 00:14:26.300 "driver_specific": { 00:14:26.300 "raid": { 00:14:26.300 "uuid": "cdf71350-fb3b-4b78-97a6-3411c069071c", 00:14:26.300 "strip_size_kb": 64, 00:14:26.300 
"state": "online", 00:14:26.300 "raid_level": "raid0", 00:14:26.300 "superblock": false, 00:14:26.300 "num_base_bdevs": 4, 00:14:26.300 "num_base_bdevs_discovered": 4, 00:14:26.300 "num_base_bdevs_operational": 4, 00:14:26.300 "base_bdevs_list": [ 00:14:26.300 { 00:14:26.300 "name": "BaseBdev1", 00:14:26.300 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:26.300 "is_configured": true, 00:14:26.300 "data_offset": 0, 00:14:26.300 "data_size": 65536 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "name": "BaseBdev2", 00:14:26.300 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:26.300 "is_configured": true, 00:14:26.300 "data_offset": 0, 00:14:26.300 "data_size": 65536 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "name": "BaseBdev3", 00:14:26.300 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:26.300 "is_configured": true, 00:14:26.300 "data_offset": 0, 00:14:26.300 "data_size": 65536 00:14:26.300 }, 00:14:26.300 { 00:14:26.300 "name": "BaseBdev4", 00:14:26.300 "uuid": "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd", 00:14:26.300 "is_configured": true, 00:14:26.300 "data_offset": 0, 00:14:26.300 "data_size": 65536 00:14:26.301 } 00:14:26.301 ] 00:14:26.301 } 00:14:26.301 } 00:14:26.301 }' 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:26.301 BaseBdev2 00:14:26.301 BaseBdev3 00:14:26.301 BaseBdev4' 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:26.301 "name": "BaseBdev1", 00:14:26.301 "aliases": [ 00:14:26.301 "0b937528-f5dd-4653-971e-6ec7ea13a88f" 00:14:26.301 ], 00:14:26.301 "product_name": "Malloc disk", 00:14:26.301 "block_size": 512, 00:14:26.301 "num_blocks": 65536, 00:14:26.301 "uuid": "0b937528-f5dd-4653-971e-6ec7ea13a88f", 00:14:26.301 "assigned_rate_limits": { 00:14:26.301 "rw_ios_per_sec": 0, 00:14:26.301 "rw_mbytes_per_sec": 0, 00:14:26.301 "r_mbytes_per_sec": 0, 00:14:26.301 "w_mbytes_per_sec": 0 00:14:26.301 }, 00:14:26.301 "claimed": true, 00:14:26.301 "claim_type": "exclusive_write", 00:14:26.301 "zoned": false, 00:14:26.301 "supported_io_types": { 00:14:26.301 "read": true, 00:14:26.301 "write": true, 00:14:26.301 "unmap": true, 00:14:26.301 "flush": true, 00:14:26.301 "reset": true, 00:14:26.301 "nvme_admin": false, 00:14:26.301 "nvme_io": false, 00:14:26.301 "nvme_io_md": false, 00:14:26.301 "write_zeroes": true, 00:14:26.301 "zcopy": true, 00:14:26.301 "get_zone_info": false, 00:14:26.301 "zone_management": false, 00:14:26.301 "zone_append": false, 00:14:26.301 "compare": false, 00:14:26.301 "compare_and_write": false, 00:14:26.301 "abort": true, 00:14:26.301 "seek_hole": false, 00:14:26.301 "seek_data": false, 00:14:26.301 "copy": true, 00:14:26.301 "nvme_iov_md": false 00:14:26.301 }, 00:14:26.301 "memory_domains": [ 00:14:26.301 { 00:14:26.301 "dma_device_id": "system", 00:14:26.301 "dma_device_type": 1 00:14:26.301 }, 00:14:26.301 { 00:14:26.301 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.301 "dma_device_type": 2 00:14:26.301 } 00:14:26.301 ], 00:14:26.301 "driver_specific": {} 00:14:26.301 }' 00:14:26.301 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:26.559 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:26.816 "name": "BaseBdev2", 00:14:26.816 "aliases": [ 00:14:26.816 "752c33aa-569d-430a-b5c3-08dda58fd03c" 00:14:26.816 ], 00:14:26.816 "product_name": "Malloc disk", 00:14:26.816 "block_size": 512, 00:14:26.816 "num_blocks": 65536, 00:14:26.816 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:26.816 "assigned_rate_limits": { 00:14:26.816 "rw_ios_per_sec": 0, 00:14:26.816 "rw_mbytes_per_sec": 0, 00:14:26.816 "r_mbytes_per_sec": 0, 00:14:26.816 "w_mbytes_per_sec": 0 00:14:26.816 }, 00:14:26.816 "claimed": true, 00:14:26.816 "claim_type": "exclusive_write", 00:14:26.816 "zoned": false, 00:14:26.816 "supported_io_types": { 00:14:26.816 "read": true, 00:14:26.816 "write": true, 00:14:26.816 "unmap": true, 00:14:26.816 "flush": true, 00:14:26.816 "reset": true, 00:14:26.816 "nvme_admin": false, 00:14:26.816 "nvme_io": false, 00:14:26.816 "nvme_io_md": false, 00:14:26.816 "write_zeroes": true, 00:14:26.816 "zcopy": true, 00:14:26.816 "get_zone_info": false, 00:14:26.816 "zone_management": false, 00:14:26.816 "zone_append": false, 00:14:26.816 "compare": false, 00:14:26.816 "compare_and_write": false, 00:14:26.816 "abort": true, 00:14:26.816 "seek_hole": false, 00:14:26.816 "seek_data": false, 00:14:26.816 "copy": true, 00:14:26.816 "nvme_iov_md": false 00:14:26.816 }, 00:14:26.816 "memory_domains": [ 00:14:26.816 { 00:14:26.816 "dma_device_id": "system", 00:14:26.816 "dma_device_type": 1 00:14:26.816 }, 00:14:26.816 { 00:14:26.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.816 "dma_device_type": 2 00:14:26.816 } 00:14:26.816 ], 00:14:26.816 "driver_specific": {} 00:14:26.816 }' 00:14:26.816 18:51:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:26.816 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.074 18:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.074 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.074 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.074 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.074 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:27.074 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.333 "name": "BaseBdev3", 00:14:27.333 "aliases": [ 00:14:27.333 "6bda2bb2-e5e3-4815-a096-18445feedc73" 00:14:27.333 ], 00:14:27.333 "product_name": "Malloc disk", 00:14:27.333 "block_size": 512, 00:14:27.333 "num_blocks": 65536, 00:14:27.333 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:27.333 "assigned_rate_limits": { 00:14:27.333 "rw_ios_per_sec": 0, 00:14:27.333 "rw_mbytes_per_sec": 0, 00:14:27.333 "r_mbytes_per_sec": 0, 00:14:27.333 "w_mbytes_per_sec": 0 00:14:27.333 }, 00:14:27.333 "claimed": true, 00:14:27.333 "claim_type": "exclusive_write", 00:14:27.333 "zoned": false, 00:14:27.333 "supported_io_types": { 00:14:27.333 "read": true, 00:14:27.333 "write": true, 00:14:27.333 "unmap": true, 00:14:27.333 "flush": true, 00:14:27.333 "reset": true, 00:14:27.333 "nvme_admin": false, 00:14:27.333 "nvme_io": false, 00:14:27.333 "nvme_io_md": false, 00:14:27.333 "write_zeroes": true, 00:14:27.333 "zcopy": true, 00:14:27.333 "get_zone_info": false, 00:14:27.333 "zone_management": false, 00:14:27.333 "zone_append": false, 00:14:27.333 "compare": false, 00:14:27.333 "compare_and_write": false, 00:14:27.333 "abort": true, 00:14:27.333 "seek_hole": false, 00:14:27.333 "seek_data": false, 00:14:27.333 "copy": true, 00:14:27.333 "nvme_iov_md": false 00:14:27.333 }, 00:14:27.333 "memory_domains": [ 00:14:27.333 { 00:14:27.333 "dma_device_id": "system", 00:14:27.333 "dma_device_type": 1 00:14:27.333 }, 00:14:27.333 { 00:14:27.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.333 "dma_device_type": 2 00:14:27.333 } 00:14:27.333 ], 00:14:27.333 "driver_specific": {} 00:14:27.333 }' 00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
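The xtrace entries here are the verify_raid_bdev_properties loop: for each configured base bdev it fetches the descriptor with bdev_get_bdevs and compares block_size, md_size, md_interleave and dif_type through jq. A minimal by-hand sketch of the same check follows, assuming the SPDK target from this run is still serving RPCs on /var/tmp/spdk-raid.sock and that a 512-byte-block malloc base bdev named BaseBdev3 exists (both taken from this log, not guaranteed on another setup):

  # Fetch the descriptor for one base bdev (same RPC the test script calls)
  info=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 | jq '.[]')
  # Malloc base bdevs report 512-byte blocks and no metadata/DIF, which is what the test asserts
  [[ $(echo "$info" | jq .block_size) == 512 ]] &&
  [[ $(echo "$info" | jq .md_size) == null ]] &&
  [[ $(echo "$info" | jq .md_interleave) == null ]] &&
  [[ $(echo "$info" | jq .dif_type) == null ]] &&
  echo "BaseBdev3 properties match"
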
00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.333 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:27.597 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.856 "name": "BaseBdev4", 00:14:27.856 "aliases": [ 00:14:27.856 "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd" 00:14:27.856 ], 00:14:27.856 "product_name": "Malloc disk", 00:14:27.856 "block_size": 512, 00:14:27.856 "num_blocks": 65536, 00:14:27.856 "uuid": "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd", 00:14:27.856 "assigned_rate_limits": { 00:14:27.856 "rw_ios_per_sec": 0, 00:14:27.856 "rw_mbytes_per_sec": 0, 00:14:27.856 "r_mbytes_per_sec": 0, 00:14:27.856 "w_mbytes_per_sec": 0 00:14:27.856 }, 00:14:27.856 "claimed": true, 00:14:27.856 "claim_type": "exclusive_write", 00:14:27.856 "zoned": false, 00:14:27.856 "supported_io_types": { 00:14:27.856 "read": true, 00:14:27.856 "write": true, 00:14:27.856 "unmap": true, 00:14:27.856 "flush": true, 00:14:27.856 "reset": true, 00:14:27.856 "nvme_admin": false, 00:14:27.856 "nvme_io": false, 00:14:27.856 "nvme_io_md": false, 00:14:27.856 "write_zeroes": true, 00:14:27.856 "zcopy": true, 00:14:27.856 "get_zone_info": false, 00:14:27.856 "zone_management": false, 00:14:27.856 "zone_append": false, 00:14:27.856 "compare": false, 00:14:27.856 "compare_and_write": false, 00:14:27.856 "abort": true, 00:14:27.856 "seek_hole": false, 00:14:27.856 "seek_data": false, 00:14:27.856 "copy": true, 00:14:27.856 "nvme_iov_md": false 00:14:27.856 }, 00:14:27.856 "memory_domains": [ 00:14:27.856 { 00:14:27.856 "dma_device_id": "system", 00:14:27.856 "dma_device_type": 1 00:14:27.856 }, 00:14:27.856 { 00:14:27.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.856 "dma_device_type": 2 00:14:27.856 } 00:14:27.856 ], 00:14:27.856 "driver_specific": {} 00:14:27.856 }' 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.856 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.115 18:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:28.373 [2024-07-24 18:51:13.131289] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:28.373 [2024-07-24 18:51:13.131307] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:28.373 [2024-07-24 18:51:13.131338] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.373 
"name": "Existed_Raid", 00:14:28.373 "uuid": "cdf71350-fb3b-4b78-97a6-3411c069071c", 00:14:28.373 "strip_size_kb": 64, 00:14:28.373 "state": "offline", 00:14:28.373 "raid_level": "raid0", 00:14:28.373 "superblock": false, 00:14:28.373 "num_base_bdevs": 4, 00:14:28.373 "num_base_bdevs_discovered": 3, 00:14:28.373 "num_base_bdevs_operational": 3, 00:14:28.373 "base_bdevs_list": [ 00:14:28.373 { 00:14:28.373 "name": null, 00:14:28.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.373 "is_configured": false, 00:14:28.373 "data_offset": 0, 00:14:28.373 "data_size": 65536 00:14:28.373 }, 00:14:28.373 { 00:14:28.373 "name": "BaseBdev2", 00:14:28.373 "uuid": "752c33aa-569d-430a-b5c3-08dda58fd03c", 00:14:28.373 "is_configured": true, 00:14:28.373 "data_offset": 0, 00:14:28.373 "data_size": 65536 00:14:28.373 }, 00:14:28.373 { 00:14:28.373 "name": "BaseBdev3", 00:14:28.373 "uuid": "6bda2bb2-e5e3-4815-a096-18445feedc73", 00:14:28.373 "is_configured": true, 00:14:28.373 "data_offset": 0, 00:14:28.373 "data_size": 65536 00:14:28.373 }, 00:14:28.373 { 00:14:28.373 "name": "BaseBdev4", 00:14:28.373 "uuid": "ba4bcc11-ddfb-4db0-a535-b63b1cf348fd", 00:14:28.373 "is_configured": true, 00:14:28.373 "data_offset": 0, 00:14:28.373 "data_size": 65536 00:14:28.373 } 00:14:28.373 ] 00:14:28.373 }' 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.373 18:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.937 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:28.937 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:28.937 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.937 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:29.196 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:29.196 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:29.196 18:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:29.196 [2024-07-24 18:51:14.130709] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:29.196 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:29.196 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:29.196 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.196 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:29.455 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:29.455 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:29.455 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:29.713 
[2024-07-24 18:51:14.477346] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:29.713 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:29.971 [2024-07-24 18:51:14.820026] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:29.971 [2024-07-24 18:51:14.820053] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2182490 name Existed_Raid, state offline 00:14:29.971 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:29.971 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:29.971 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:29.971 18:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:30.229 BaseBdev2 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:30.229 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:30.524 18:51:15 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:30.524 [ 00:14:30.524 { 00:14:30.524 "name": "BaseBdev2", 00:14:30.524 "aliases": [ 00:14:30.524 "70765780-f3ad-4f35-acaa-a5c3c52938f1" 00:14:30.524 ], 00:14:30.524 "product_name": "Malloc disk", 00:14:30.524 "block_size": 512, 00:14:30.524 "num_blocks": 65536, 00:14:30.524 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:30.524 "assigned_rate_limits": { 00:14:30.524 "rw_ios_per_sec": 0, 00:14:30.524 "rw_mbytes_per_sec": 0, 00:14:30.524 "r_mbytes_per_sec": 0, 00:14:30.524 "w_mbytes_per_sec": 0 00:14:30.524 }, 00:14:30.524 "claimed": false, 00:14:30.524 "zoned": false, 00:14:30.524 "supported_io_types": { 00:14:30.524 "read": true, 00:14:30.524 "write": true, 00:14:30.524 "unmap": true, 00:14:30.524 "flush": true, 00:14:30.524 "reset": true, 00:14:30.524 "nvme_admin": false, 00:14:30.524 "nvme_io": false, 00:14:30.524 "nvme_io_md": false, 00:14:30.524 "write_zeroes": true, 00:14:30.524 "zcopy": true, 00:14:30.524 "get_zone_info": false, 00:14:30.524 "zone_management": false, 00:14:30.524 "zone_append": false, 00:14:30.524 "compare": false, 00:14:30.524 "compare_and_write": false, 00:14:30.524 "abort": true, 00:14:30.524 "seek_hole": false, 00:14:30.524 "seek_data": false, 00:14:30.524 "copy": true, 00:14:30.524 "nvme_iov_md": false 00:14:30.524 }, 00:14:30.524 "memory_domains": [ 00:14:30.524 { 00:14:30.524 "dma_device_id": "system", 00:14:30.524 "dma_device_type": 1 00:14:30.524 }, 00:14:30.524 { 00:14:30.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.524 "dma_device_type": 2 00:14:30.524 } 00:14:30.524 ], 00:14:30.524 "driver_specific": {} 00:14:30.524 } 00:14:30.524 ] 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:30.807 BaseBdev3 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:30.807 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.065 18:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:31.065 [ 00:14:31.065 { 00:14:31.065 "name": "BaseBdev3", 00:14:31.065 "aliases": [ 00:14:31.065 
"3804f4a4-e45a-4207-a6f0-b1c562e49361" 00:14:31.065 ], 00:14:31.065 "product_name": "Malloc disk", 00:14:31.065 "block_size": 512, 00:14:31.065 "num_blocks": 65536, 00:14:31.065 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:31.065 "assigned_rate_limits": { 00:14:31.065 "rw_ios_per_sec": 0, 00:14:31.065 "rw_mbytes_per_sec": 0, 00:14:31.065 "r_mbytes_per_sec": 0, 00:14:31.065 "w_mbytes_per_sec": 0 00:14:31.065 }, 00:14:31.065 "claimed": false, 00:14:31.065 "zoned": false, 00:14:31.065 "supported_io_types": { 00:14:31.065 "read": true, 00:14:31.065 "write": true, 00:14:31.065 "unmap": true, 00:14:31.065 "flush": true, 00:14:31.065 "reset": true, 00:14:31.065 "nvme_admin": false, 00:14:31.065 "nvme_io": false, 00:14:31.065 "nvme_io_md": false, 00:14:31.065 "write_zeroes": true, 00:14:31.065 "zcopy": true, 00:14:31.065 "get_zone_info": false, 00:14:31.065 "zone_management": false, 00:14:31.065 "zone_append": false, 00:14:31.065 "compare": false, 00:14:31.065 "compare_and_write": false, 00:14:31.065 "abort": true, 00:14:31.065 "seek_hole": false, 00:14:31.065 "seek_data": false, 00:14:31.065 "copy": true, 00:14:31.065 "nvme_iov_md": false 00:14:31.065 }, 00:14:31.065 "memory_domains": [ 00:14:31.065 { 00:14:31.065 "dma_device_id": "system", 00:14:31.065 "dma_device_type": 1 00:14:31.065 }, 00:14:31.065 { 00:14:31.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.065 "dma_device_type": 2 00:14:31.065 } 00:14:31.065 ], 00:14:31.065 "driver_specific": {} 00:14:31.065 } 00:14:31.065 ] 00:14:31.066 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:31.066 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:31.066 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:31.066 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:31.324 BaseBdev4 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:31.324 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.582 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:31.582 [ 00:14:31.582 { 00:14:31.582 "name": "BaseBdev4", 00:14:31.582 "aliases": [ 00:14:31.582 "546a5f34-3b1d-4a5e-b29c-02651f00ddbe" 00:14:31.582 ], 00:14:31.582 "product_name": "Malloc disk", 00:14:31.582 "block_size": 512, 00:14:31.582 "num_blocks": 65536, 00:14:31.582 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:31.582 "assigned_rate_limits": { 00:14:31.582 
"rw_ios_per_sec": 0, 00:14:31.582 "rw_mbytes_per_sec": 0, 00:14:31.582 "r_mbytes_per_sec": 0, 00:14:31.582 "w_mbytes_per_sec": 0 00:14:31.582 }, 00:14:31.582 "claimed": false, 00:14:31.582 "zoned": false, 00:14:31.582 "supported_io_types": { 00:14:31.582 "read": true, 00:14:31.582 "write": true, 00:14:31.582 "unmap": true, 00:14:31.582 "flush": true, 00:14:31.582 "reset": true, 00:14:31.582 "nvme_admin": false, 00:14:31.582 "nvme_io": false, 00:14:31.582 "nvme_io_md": false, 00:14:31.582 "write_zeroes": true, 00:14:31.582 "zcopy": true, 00:14:31.582 "get_zone_info": false, 00:14:31.582 "zone_management": false, 00:14:31.582 "zone_append": false, 00:14:31.582 "compare": false, 00:14:31.582 "compare_and_write": false, 00:14:31.582 "abort": true, 00:14:31.582 "seek_hole": false, 00:14:31.582 "seek_data": false, 00:14:31.582 "copy": true, 00:14:31.582 "nvme_iov_md": false 00:14:31.582 }, 00:14:31.582 "memory_domains": [ 00:14:31.582 { 00:14:31.582 "dma_device_id": "system", 00:14:31.582 "dma_device_type": 1 00:14:31.582 }, 00:14:31.582 { 00:14:31.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.582 "dma_device_type": 2 00:14:31.582 } 00:14:31.582 ], 00:14:31.582 "driver_specific": {} 00:14:31.582 } 00:14:31.582 ] 00:14:31.582 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:31.582 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:31.582 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:31.582 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:31.840 [2024-07-24 18:51:16.653527] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:31.840 [2024-07-24 18:51:16.653557] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:31.840 [2024-07-24 18:51:16.653568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:31.840 [2024-07-24 18:51:16.654568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:31.840 [2024-07-24 18:51:16.654597] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.840 18:51:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.840 "name": "Existed_Raid", 00:14:31.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.840 "strip_size_kb": 64, 00:14:31.840 "state": "configuring", 00:14:31.840 "raid_level": "raid0", 00:14:31.840 "superblock": false, 00:14:31.840 "num_base_bdevs": 4, 00:14:31.840 "num_base_bdevs_discovered": 3, 00:14:31.840 "num_base_bdevs_operational": 4, 00:14:31.840 "base_bdevs_list": [ 00:14:31.840 { 00:14:31.840 "name": "BaseBdev1", 00:14:31.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.840 "is_configured": false, 00:14:31.840 "data_offset": 0, 00:14:31.840 "data_size": 0 00:14:31.840 }, 00:14:31.840 { 00:14:31.840 "name": "BaseBdev2", 00:14:31.840 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:31.840 "is_configured": true, 00:14:31.840 "data_offset": 0, 00:14:31.840 "data_size": 65536 00:14:31.840 }, 00:14:31.840 { 00:14:31.840 "name": "BaseBdev3", 00:14:31.840 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:31.840 "is_configured": true, 00:14:31.840 "data_offset": 0, 00:14:31.840 "data_size": 65536 00:14:31.840 }, 00:14:31.840 { 00:14:31.840 "name": "BaseBdev4", 00:14:31.840 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:31.840 "is_configured": true, 00:14:31.840 "data_offset": 0, 00:14:31.840 "data_size": 65536 00:14:31.840 } 00:14:31.840 ] 00:14:31.840 }' 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.840 18:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.406 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:32.665 [2024-07-24 18:51:17.479638] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.665 18:51:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.665 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.923 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.923 "name": "Existed_Raid", 00:14:32.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.923 "strip_size_kb": 64, 00:14:32.923 "state": "configuring", 00:14:32.923 "raid_level": "raid0", 00:14:32.923 "superblock": false, 00:14:32.923 "num_base_bdevs": 4, 00:14:32.923 "num_base_bdevs_discovered": 2, 00:14:32.923 "num_base_bdevs_operational": 4, 00:14:32.923 "base_bdevs_list": [ 00:14:32.923 { 00:14:32.923 "name": "BaseBdev1", 00:14:32.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.923 "is_configured": false, 00:14:32.923 "data_offset": 0, 00:14:32.923 "data_size": 0 00:14:32.923 }, 00:14:32.923 { 00:14:32.923 "name": null, 00:14:32.923 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:32.923 "is_configured": false, 00:14:32.923 "data_offset": 0, 00:14:32.923 "data_size": 65536 00:14:32.923 }, 00:14:32.923 { 00:14:32.923 "name": "BaseBdev3", 00:14:32.923 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:32.923 "is_configured": true, 00:14:32.923 "data_offset": 0, 00:14:32.923 "data_size": 65536 00:14:32.923 }, 00:14:32.923 { 00:14:32.923 "name": "BaseBdev4", 00:14:32.923 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:32.923 "is_configured": true, 00:14:32.923 "data_offset": 0, 00:14:32.923 "data_size": 65536 00:14:32.923 } 00:14:32.923 ] 00:14:32.923 }' 00:14:32.923 18:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.923 18:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.181 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.181 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:33.438 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:33.438 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:33.696 [2024-07-24 18:51:18.460823] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:33.696 BaseBdev1 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:33.696 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:33.697 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:33.955 [ 00:14:33.955 { 00:14:33.955 "name": "BaseBdev1", 00:14:33.955 "aliases": [ 00:14:33.955 "4a91854c-f6d4-4553-a185-ac7b0a283703" 00:14:33.955 ], 00:14:33.955 "product_name": "Malloc disk", 00:14:33.955 "block_size": 512, 00:14:33.955 "num_blocks": 65536, 00:14:33.955 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:33.955 "assigned_rate_limits": { 00:14:33.955 "rw_ios_per_sec": 0, 00:14:33.955 "rw_mbytes_per_sec": 0, 00:14:33.955 "r_mbytes_per_sec": 0, 00:14:33.955 "w_mbytes_per_sec": 0 00:14:33.955 }, 00:14:33.955 "claimed": true, 00:14:33.955 "claim_type": "exclusive_write", 00:14:33.955 "zoned": false, 00:14:33.955 "supported_io_types": { 00:14:33.955 "read": true, 00:14:33.955 "write": true, 00:14:33.955 "unmap": true, 00:14:33.955 "flush": true, 00:14:33.955 "reset": true, 00:14:33.955 "nvme_admin": false, 00:14:33.955 "nvme_io": false, 00:14:33.955 "nvme_io_md": false, 00:14:33.955 "write_zeroes": true, 00:14:33.955 "zcopy": true, 00:14:33.955 "get_zone_info": false, 00:14:33.955 "zone_management": false, 00:14:33.955 "zone_append": false, 00:14:33.955 "compare": false, 00:14:33.955 "compare_and_write": false, 00:14:33.955 "abort": true, 00:14:33.955 "seek_hole": false, 00:14:33.955 "seek_data": false, 00:14:33.955 "copy": true, 00:14:33.955 "nvme_iov_md": false 00:14:33.955 }, 00:14:33.955 "memory_domains": [ 00:14:33.955 { 00:14:33.955 "dma_device_id": "system", 00:14:33.955 "dma_device_type": 1 00:14:33.955 }, 00:14:33.955 { 00:14:33.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.955 "dma_device_type": 2 00:14:33.955 } 00:14:33.955 ], 00:14:33.955 "driver_specific": {} 00:14:33.955 } 00:14:33.955 ] 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.955 "name": "Existed_Raid", 00:14:33.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.955 "strip_size_kb": 64, 00:14:33.955 "state": "configuring", 00:14:33.955 "raid_level": "raid0", 00:14:33.955 "superblock": false, 00:14:33.955 "num_base_bdevs": 4, 00:14:33.955 "num_base_bdevs_discovered": 3, 00:14:33.955 "num_base_bdevs_operational": 4, 00:14:33.955 "base_bdevs_list": [ 00:14:33.955 { 00:14:33.955 "name": "BaseBdev1", 00:14:33.955 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:33.955 "is_configured": true, 00:14:33.955 "data_offset": 0, 00:14:33.955 "data_size": 65536 00:14:33.955 }, 00:14:33.955 { 00:14:33.955 "name": null, 00:14:33.955 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:33.955 "is_configured": false, 00:14:33.955 "data_offset": 0, 00:14:33.955 "data_size": 65536 00:14:33.955 }, 00:14:33.955 { 00:14:33.955 "name": "BaseBdev3", 00:14:33.955 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:33.955 "is_configured": true, 00:14:33.955 "data_offset": 0, 00:14:33.955 "data_size": 65536 00:14:33.955 }, 00:14:33.955 { 00:14:33.955 "name": "BaseBdev4", 00:14:33.955 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:33.955 "is_configured": true, 00:14:33.955 "data_offset": 0, 00:14:33.955 "data_size": 65536 00:14:33.955 } 00:14:33.955 ] 00:14:33.955 }' 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.955 18:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.522 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.522 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:34.779 [2024-07-24 18:51:19.744147] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.779 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.780 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.037 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.037 "name": "Existed_Raid", 00:14:35.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.037 "strip_size_kb": 64, 00:14:35.037 "state": "configuring", 00:14:35.037 "raid_level": "raid0", 00:14:35.037 "superblock": false, 00:14:35.037 "num_base_bdevs": 4, 00:14:35.037 "num_base_bdevs_discovered": 2, 00:14:35.037 "num_base_bdevs_operational": 4, 00:14:35.037 "base_bdevs_list": [ 00:14:35.037 { 00:14:35.037 "name": "BaseBdev1", 00:14:35.037 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:35.037 "is_configured": true, 00:14:35.037 "data_offset": 0, 00:14:35.037 "data_size": 65536 00:14:35.037 }, 00:14:35.037 { 00:14:35.037 "name": null, 00:14:35.037 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:35.037 "is_configured": false, 00:14:35.037 "data_offset": 0, 00:14:35.037 "data_size": 65536 00:14:35.037 }, 00:14:35.037 { 00:14:35.037 "name": null, 00:14:35.037 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:35.037 "is_configured": false, 00:14:35.037 "data_offset": 0, 00:14:35.037 "data_size": 65536 00:14:35.037 }, 00:14:35.037 { 00:14:35.037 "name": "BaseBdev4", 00:14:35.037 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:35.037 "is_configured": true, 00:14:35.037 "data_offset": 0, 00:14:35.037 "data_size": 65536 00:14:35.037 } 00:14:35.037 ] 00:14:35.037 }' 00:14:35.037 18:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.037 18:51:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.604 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.604 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:35.604 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:35.604 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:35.862 [2024-07-24 18:51:20.734713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:35.862 18:51:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.862 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.121 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.121 "name": "Existed_Raid", 00:14:36.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.121 "strip_size_kb": 64, 00:14:36.121 "state": "configuring", 00:14:36.121 "raid_level": "raid0", 00:14:36.121 "superblock": false, 00:14:36.121 "num_base_bdevs": 4, 00:14:36.121 "num_base_bdevs_discovered": 3, 00:14:36.121 "num_base_bdevs_operational": 4, 00:14:36.121 "base_bdevs_list": [ 00:14:36.121 { 00:14:36.121 "name": "BaseBdev1", 00:14:36.121 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:36.121 "is_configured": true, 00:14:36.121 "data_offset": 0, 00:14:36.121 "data_size": 65536 00:14:36.121 }, 00:14:36.121 { 00:14:36.121 "name": null, 00:14:36.121 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:36.121 "is_configured": false, 00:14:36.121 "data_offset": 0, 00:14:36.121 "data_size": 65536 00:14:36.121 }, 00:14:36.121 { 00:14:36.121 "name": "BaseBdev3", 00:14:36.121 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:36.121 "is_configured": true, 00:14:36.121 "data_offset": 0, 00:14:36.121 "data_size": 65536 00:14:36.121 }, 00:14:36.121 { 00:14:36.121 "name": "BaseBdev4", 00:14:36.121 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:36.121 "is_configured": true, 00:14:36.121 "data_offset": 0, 00:14:36.121 "data_size": 65536 00:14:36.121 } 00:14:36.121 ] 00:14:36.121 }' 00:14:36.121 18:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.121 18:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.688 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:36.688 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.688 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:36.688 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:36.946 [2024-07-24 18:51:21.733319] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:36.946 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:36.946 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.947 18:51:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.947 "name": "Existed_Raid", 00:14:36.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.947 "strip_size_kb": 64, 00:14:36.947 "state": "configuring", 00:14:36.947 "raid_level": "raid0", 00:14:36.947 "superblock": false, 00:14:36.947 "num_base_bdevs": 4, 00:14:36.947 "num_base_bdevs_discovered": 2, 00:14:36.947 "num_base_bdevs_operational": 4, 00:14:36.947 "base_bdevs_list": [ 00:14:36.947 { 00:14:36.947 "name": null, 00:14:36.947 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:36.947 "is_configured": false, 00:14:36.947 "data_offset": 0, 00:14:36.947 "data_size": 65536 00:14:36.947 }, 00:14:36.947 { 00:14:36.947 "name": null, 00:14:36.947 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:36.947 "is_configured": false, 00:14:36.947 "data_offset": 0, 00:14:36.947 "data_size": 65536 00:14:36.947 }, 00:14:36.947 { 00:14:36.947 "name": "BaseBdev3", 00:14:36.947 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:36.947 "is_configured": true, 00:14:36.947 "data_offset": 0, 00:14:36.947 "data_size": 65536 00:14:36.947 }, 00:14:36.947 { 00:14:36.947 "name": "BaseBdev4", 00:14:36.947 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:36.947 "is_configured": true, 00:14:36.947 "data_offset": 0, 00:14:36.947 "data_size": 65536 00:14:36.947 } 00:14:36.947 ] 00:14:36.947 }' 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.947 18:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.514 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.514 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:37.773 [2024-07-24 18:51:22.753470] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.773 18:51:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.773 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.032 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.032 "name": "Existed_Raid", 00:14:38.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.032 "strip_size_kb": 64, 00:14:38.032 "state": "configuring", 00:14:38.032 "raid_level": "raid0", 00:14:38.032 "superblock": false, 00:14:38.032 "num_base_bdevs": 4, 00:14:38.032 "num_base_bdevs_discovered": 3, 00:14:38.032 "num_base_bdevs_operational": 4, 00:14:38.032 "base_bdevs_list": [ 00:14:38.032 { 00:14:38.032 "name": null, 00:14:38.032 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:38.032 "is_configured": false, 00:14:38.032 "data_offset": 0, 00:14:38.032 "data_size": 65536 00:14:38.032 }, 00:14:38.032 { 00:14:38.032 "name": "BaseBdev2", 00:14:38.032 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:38.032 "is_configured": true, 00:14:38.032 "data_offset": 0, 00:14:38.032 "data_size": 65536 00:14:38.032 }, 00:14:38.032 { 00:14:38.032 "name": "BaseBdev3", 00:14:38.032 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:38.032 "is_configured": true, 00:14:38.032 "data_offset": 0, 00:14:38.032 "data_size": 65536 00:14:38.032 }, 00:14:38.032 { 00:14:38.032 "name": "BaseBdev4", 00:14:38.032 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:38.032 "is_configured": true, 00:14:38.032 "data_offset": 0, 00:14:38.032 "data_size": 65536 00:14:38.032 } 00:14:38.032 ] 00:14:38.032 }' 00:14:38.032 18:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.032 18:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.598 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.598 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:38.598 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:38.598 
18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.598 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:38.855 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4a91854c-f6d4-4553-a185-ac7b0a283703 00:14:39.113 [2024-07-24 18:51:23.875101] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:39.113 [2024-07-24 18:51:23.875128] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x217aab0 00:14:39.113 [2024-07-24 18:51:23.875132] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:39.113 [2024-07-24 18:51:23.875260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2181b70 00:14:39.113 [2024-07-24 18:51:23.875335] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x217aab0 00:14:39.113 [2024-07-24 18:51:23.875340] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x217aab0 00:14:39.113 [2024-07-24 18:51:23.875477] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.113 NewBaseBdev 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.113 18:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.113 18:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:39.371 [ 00:14:39.371 { 00:14:39.371 "name": "NewBaseBdev", 00:14:39.371 "aliases": [ 00:14:39.372 "4a91854c-f6d4-4553-a185-ac7b0a283703" 00:14:39.372 ], 00:14:39.372 "product_name": "Malloc disk", 00:14:39.372 "block_size": 512, 00:14:39.372 "num_blocks": 65536, 00:14:39.372 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:39.372 "assigned_rate_limits": { 00:14:39.372 "rw_ios_per_sec": 0, 00:14:39.372 "rw_mbytes_per_sec": 0, 00:14:39.372 "r_mbytes_per_sec": 0, 00:14:39.372 "w_mbytes_per_sec": 0 00:14:39.372 }, 00:14:39.372 "claimed": true, 00:14:39.372 "claim_type": "exclusive_write", 00:14:39.372 "zoned": false, 00:14:39.372 "supported_io_types": { 00:14:39.372 "read": true, 00:14:39.372 "write": true, 00:14:39.372 "unmap": true, 00:14:39.372 "flush": true, 00:14:39.372 "reset": true, 00:14:39.372 "nvme_admin": false, 00:14:39.372 "nvme_io": false, 00:14:39.372 "nvme_io_md": false, 00:14:39.372 "write_zeroes": true, 00:14:39.372 "zcopy": true, 
00:14:39.372 "get_zone_info": false, 00:14:39.372 "zone_management": false, 00:14:39.372 "zone_append": false, 00:14:39.372 "compare": false, 00:14:39.372 "compare_and_write": false, 00:14:39.372 "abort": true, 00:14:39.372 "seek_hole": false, 00:14:39.372 "seek_data": false, 00:14:39.372 "copy": true, 00:14:39.372 "nvme_iov_md": false 00:14:39.372 }, 00:14:39.372 "memory_domains": [ 00:14:39.372 { 00:14:39.372 "dma_device_id": "system", 00:14:39.372 "dma_device_type": 1 00:14:39.372 }, 00:14:39.372 { 00:14:39.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.372 "dma_device_type": 2 00:14:39.372 } 00:14:39.372 ], 00:14:39.372 "driver_specific": {} 00:14:39.372 } 00:14:39.372 ] 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.372 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.630 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.630 "name": "Existed_Raid", 00:14:39.630 "uuid": "a2811d70-bb20-44aa-bcb5-489a44f052da", 00:14:39.630 "strip_size_kb": 64, 00:14:39.630 "state": "online", 00:14:39.630 "raid_level": "raid0", 00:14:39.630 "superblock": false, 00:14:39.630 "num_base_bdevs": 4, 00:14:39.630 "num_base_bdevs_discovered": 4, 00:14:39.630 "num_base_bdevs_operational": 4, 00:14:39.630 "base_bdevs_list": [ 00:14:39.630 { 00:14:39.630 "name": "NewBaseBdev", 00:14:39.630 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:39.630 "is_configured": true, 00:14:39.630 "data_offset": 0, 00:14:39.630 "data_size": 65536 00:14:39.630 }, 00:14:39.630 { 00:14:39.630 "name": "BaseBdev2", 00:14:39.630 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:39.630 "is_configured": true, 00:14:39.630 "data_offset": 0, 00:14:39.630 "data_size": 65536 00:14:39.630 }, 00:14:39.630 { 00:14:39.630 "name": "BaseBdev3", 00:14:39.630 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:39.630 "is_configured": true, 00:14:39.630 "data_offset": 0, 00:14:39.630 "data_size": 65536 00:14:39.630 }, 00:14:39.630 { 00:14:39.630 "name": "BaseBdev4", 00:14:39.630 "uuid": 
"546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:39.630 "is_configured": true, 00:14:39.630 "data_offset": 0, 00:14:39.630 "data_size": 65536 00:14:39.630 } 00:14:39.630 ] 00:14:39.630 }' 00:14:39.630 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.630 18:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:39.888 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.147 [2024-07-24 18:51:24.982165] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.147 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.147 "name": "Existed_Raid", 00:14:40.147 "aliases": [ 00:14:40.147 "a2811d70-bb20-44aa-bcb5-489a44f052da" 00:14:40.147 ], 00:14:40.147 "product_name": "Raid Volume", 00:14:40.147 "block_size": 512, 00:14:40.147 "num_blocks": 262144, 00:14:40.147 "uuid": "a2811d70-bb20-44aa-bcb5-489a44f052da", 00:14:40.147 "assigned_rate_limits": { 00:14:40.147 "rw_ios_per_sec": 0, 00:14:40.147 "rw_mbytes_per_sec": 0, 00:14:40.147 "r_mbytes_per_sec": 0, 00:14:40.147 "w_mbytes_per_sec": 0 00:14:40.147 }, 00:14:40.147 "claimed": false, 00:14:40.147 "zoned": false, 00:14:40.147 "supported_io_types": { 00:14:40.147 "read": true, 00:14:40.147 "write": true, 00:14:40.147 "unmap": true, 00:14:40.147 "flush": true, 00:14:40.147 "reset": true, 00:14:40.147 "nvme_admin": false, 00:14:40.147 "nvme_io": false, 00:14:40.147 "nvme_io_md": false, 00:14:40.147 "write_zeroes": true, 00:14:40.147 "zcopy": false, 00:14:40.147 "get_zone_info": false, 00:14:40.147 "zone_management": false, 00:14:40.147 "zone_append": false, 00:14:40.147 "compare": false, 00:14:40.147 "compare_and_write": false, 00:14:40.147 "abort": false, 00:14:40.147 "seek_hole": false, 00:14:40.147 "seek_data": false, 00:14:40.147 "copy": false, 00:14:40.147 "nvme_iov_md": false 00:14:40.147 }, 00:14:40.147 "memory_domains": [ 00:14:40.147 { 00:14:40.147 "dma_device_id": "system", 00:14:40.147 "dma_device_type": 1 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.147 "dma_device_type": 2 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "system", 00:14:40.147 "dma_device_type": 1 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.147 "dma_device_type": 2 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "system", 00:14:40.147 "dma_device_type": 1 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.147 "dma_device_type": 2 00:14:40.147 }, 
00:14:40.147 { 00:14:40.147 "dma_device_id": "system", 00:14:40.147 "dma_device_type": 1 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.147 "dma_device_type": 2 00:14:40.147 } 00:14:40.147 ], 00:14:40.147 "driver_specific": { 00:14:40.147 "raid": { 00:14:40.147 "uuid": "a2811d70-bb20-44aa-bcb5-489a44f052da", 00:14:40.147 "strip_size_kb": 64, 00:14:40.147 "state": "online", 00:14:40.147 "raid_level": "raid0", 00:14:40.147 "superblock": false, 00:14:40.147 "num_base_bdevs": 4, 00:14:40.147 "num_base_bdevs_discovered": 4, 00:14:40.147 "num_base_bdevs_operational": 4, 00:14:40.147 "base_bdevs_list": [ 00:14:40.147 { 00:14:40.147 "name": "NewBaseBdev", 00:14:40.147 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:40.147 "is_configured": true, 00:14:40.147 "data_offset": 0, 00:14:40.147 "data_size": 65536 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "name": "BaseBdev2", 00:14:40.147 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:40.147 "is_configured": true, 00:14:40.147 "data_offset": 0, 00:14:40.147 "data_size": 65536 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "name": "BaseBdev3", 00:14:40.147 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:40.147 "is_configured": true, 00:14:40.147 "data_offset": 0, 00:14:40.147 "data_size": 65536 00:14:40.147 }, 00:14:40.147 { 00:14:40.147 "name": "BaseBdev4", 00:14:40.147 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:40.147 "is_configured": true, 00:14:40.147 "data_offset": 0, 00:14:40.147 "data_size": 65536 00:14:40.147 } 00:14:40.147 ] 00:14:40.147 } 00:14:40.147 } 00:14:40.147 }' 00:14:40.147 18:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:40.147 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:40.147 BaseBdev2 00:14:40.147 BaseBdev3 00:14:40.147 BaseBdev4' 00:14:40.147 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.147 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:40.147 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.406 "name": "NewBaseBdev", 00:14:40.406 "aliases": [ 00:14:40.406 "4a91854c-f6d4-4553-a185-ac7b0a283703" 00:14:40.406 ], 00:14:40.406 "product_name": "Malloc disk", 00:14:40.406 "block_size": 512, 00:14:40.406 "num_blocks": 65536, 00:14:40.406 "uuid": "4a91854c-f6d4-4553-a185-ac7b0a283703", 00:14:40.406 "assigned_rate_limits": { 00:14:40.406 "rw_ios_per_sec": 0, 00:14:40.406 "rw_mbytes_per_sec": 0, 00:14:40.406 "r_mbytes_per_sec": 0, 00:14:40.406 "w_mbytes_per_sec": 0 00:14:40.406 }, 00:14:40.406 "claimed": true, 00:14:40.406 "claim_type": "exclusive_write", 00:14:40.406 "zoned": false, 00:14:40.406 "supported_io_types": { 00:14:40.406 "read": true, 00:14:40.406 "write": true, 00:14:40.406 "unmap": true, 00:14:40.406 "flush": true, 00:14:40.406 "reset": true, 00:14:40.406 "nvme_admin": false, 00:14:40.406 "nvme_io": false, 00:14:40.406 "nvme_io_md": false, 00:14:40.406 "write_zeroes": true, 00:14:40.406 "zcopy": true, 00:14:40.406 "get_zone_info": false, 00:14:40.406 "zone_management": false, 00:14:40.406 "zone_append": false, 
00:14:40.406 "compare": false, 00:14:40.406 "compare_and_write": false, 00:14:40.406 "abort": true, 00:14:40.406 "seek_hole": false, 00:14:40.406 "seek_data": false, 00:14:40.406 "copy": true, 00:14:40.406 "nvme_iov_md": false 00:14:40.406 }, 00:14:40.406 "memory_domains": [ 00:14:40.406 { 00:14:40.406 "dma_device_id": "system", 00:14:40.406 "dma_device_type": 1 00:14:40.406 }, 00:14:40.406 { 00:14:40.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.406 "dma_device_type": 2 00:14:40.406 } 00:14:40.406 ], 00:14:40.406 "driver_specific": {} 00:14:40.406 }' 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.406 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.665 "name": "BaseBdev2", 00:14:40.665 "aliases": [ 00:14:40.665 "70765780-f3ad-4f35-acaa-a5c3c52938f1" 00:14:40.665 ], 00:14:40.665 "product_name": "Malloc disk", 00:14:40.665 "block_size": 512, 00:14:40.665 "num_blocks": 65536, 00:14:40.665 "uuid": "70765780-f3ad-4f35-acaa-a5c3c52938f1", 00:14:40.665 "assigned_rate_limits": { 00:14:40.665 "rw_ios_per_sec": 0, 00:14:40.665 "rw_mbytes_per_sec": 0, 00:14:40.665 "r_mbytes_per_sec": 0, 00:14:40.665 "w_mbytes_per_sec": 0 00:14:40.665 }, 00:14:40.665 "claimed": true, 00:14:40.665 "claim_type": "exclusive_write", 00:14:40.665 "zoned": false, 00:14:40.665 "supported_io_types": { 00:14:40.665 "read": true, 00:14:40.665 "write": true, 00:14:40.665 "unmap": true, 00:14:40.665 "flush": true, 00:14:40.665 "reset": true, 00:14:40.665 "nvme_admin": false, 00:14:40.665 "nvme_io": false, 00:14:40.665 "nvme_io_md": false, 00:14:40.665 "write_zeroes": true, 00:14:40.665 "zcopy": true, 00:14:40.665 "get_zone_info": false, 00:14:40.665 "zone_management": false, 00:14:40.665 "zone_append": false, 00:14:40.665 "compare": false, 00:14:40.665 "compare_and_write": false, 00:14:40.665 "abort": true, 00:14:40.665 "seek_hole": false, 00:14:40.665 "seek_data": false, 00:14:40.665 
"copy": true, 00:14:40.665 "nvme_iov_md": false 00:14:40.665 }, 00:14:40.665 "memory_domains": [ 00:14:40.665 { 00:14:40.665 "dma_device_id": "system", 00:14:40.665 "dma_device_type": 1 00:14:40.665 }, 00:14:40.665 { 00:14:40.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.665 "dma_device_type": 2 00:14:40.665 } 00:14:40.665 ], 00:14:40.665 "driver_specific": {} 00:14:40.665 }' 00:14:40.665 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.923 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.181 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.181 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.181 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.182 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:41.182 18:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.182 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.182 "name": "BaseBdev3", 00:14:41.182 "aliases": [ 00:14:41.182 "3804f4a4-e45a-4207-a6f0-b1c562e49361" 00:14:41.182 ], 00:14:41.182 "product_name": "Malloc disk", 00:14:41.182 "block_size": 512, 00:14:41.182 "num_blocks": 65536, 00:14:41.182 "uuid": "3804f4a4-e45a-4207-a6f0-b1c562e49361", 00:14:41.182 "assigned_rate_limits": { 00:14:41.182 "rw_ios_per_sec": 0, 00:14:41.182 "rw_mbytes_per_sec": 0, 00:14:41.182 "r_mbytes_per_sec": 0, 00:14:41.182 "w_mbytes_per_sec": 0 00:14:41.182 }, 00:14:41.182 "claimed": true, 00:14:41.182 "claim_type": "exclusive_write", 00:14:41.182 "zoned": false, 00:14:41.182 "supported_io_types": { 00:14:41.182 "read": true, 00:14:41.182 "write": true, 00:14:41.182 "unmap": true, 00:14:41.182 "flush": true, 00:14:41.182 "reset": true, 00:14:41.182 "nvme_admin": false, 00:14:41.182 "nvme_io": false, 00:14:41.182 "nvme_io_md": false, 00:14:41.182 "write_zeroes": true, 00:14:41.182 "zcopy": true, 00:14:41.182 "get_zone_info": false, 00:14:41.182 "zone_management": false, 00:14:41.182 "zone_append": false, 00:14:41.182 "compare": false, 00:14:41.182 "compare_and_write": false, 00:14:41.182 "abort": true, 00:14:41.182 "seek_hole": false, 00:14:41.182 "seek_data": false, 00:14:41.182 "copy": true, 00:14:41.182 "nvme_iov_md": false 00:14:41.182 }, 00:14:41.182 "memory_domains": [ 00:14:41.182 { 00:14:41.182 "dma_device_id": "system", 00:14:41.182 
"dma_device_type": 1 00:14:41.182 }, 00:14:41.182 { 00:14:41.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.182 "dma_device_type": 2 00:14:41.182 } 00:14:41.182 ], 00:14:41.182 "driver_specific": {} 00:14:41.182 }' 00:14:41.182 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.182 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.439 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:41.697 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.697 "name": "BaseBdev4", 00:14:41.697 "aliases": [ 00:14:41.697 "546a5f34-3b1d-4a5e-b29c-02651f00ddbe" 00:14:41.697 ], 00:14:41.697 "product_name": "Malloc disk", 00:14:41.697 "block_size": 512, 00:14:41.697 "num_blocks": 65536, 00:14:41.697 "uuid": "546a5f34-3b1d-4a5e-b29c-02651f00ddbe", 00:14:41.697 "assigned_rate_limits": { 00:14:41.697 "rw_ios_per_sec": 0, 00:14:41.697 "rw_mbytes_per_sec": 0, 00:14:41.697 "r_mbytes_per_sec": 0, 00:14:41.697 "w_mbytes_per_sec": 0 00:14:41.697 }, 00:14:41.697 "claimed": true, 00:14:41.697 "claim_type": "exclusive_write", 00:14:41.697 "zoned": false, 00:14:41.697 "supported_io_types": { 00:14:41.697 "read": true, 00:14:41.697 "write": true, 00:14:41.697 "unmap": true, 00:14:41.697 "flush": true, 00:14:41.697 "reset": true, 00:14:41.697 "nvme_admin": false, 00:14:41.697 "nvme_io": false, 00:14:41.697 "nvme_io_md": false, 00:14:41.697 "write_zeroes": true, 00:14:41.697 "zcopy": true, 00:14:41.697 "get_zone_info": false, 00:14:41.697 "zone_management": false, 00:14:41.697 "zone_append": false, 00:14:41.697 "compare": false, 00:14:41.697 "compare_and_write": false, 00:14:41.697 "abort": true, 00:14:41.697 "seek_hole": false, 00:14:41.697 "seek_data": false, 00:14:41.697 "copy": true, 00:14:41.697 "nvme_iov_md": false 00:14:41.697 }, 00:14:41.697 "memory_domains": [ 00:14:41.697 { 00:14:41.697 "dma_device_id": "system", 00:14:41.697 "dma_device_type": 1 00:14:41.697 }, 00:14:41.697 { 00:14:41.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.697 "dma_device_type": 2 00:14:41.698 } 00:14:41.698 ], 
00:14:41.698 "driver_specific": {} 00:14:41.698 }' 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.698 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.955 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:42.214 [2024-07-24 18:51:26.983159] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:42.214 [2024-07-24 18:51:26.983178] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:42.214 [2024-07-24 18:51:26.983217] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:42.214 [2024-07-24 18:51:26.983259] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:42.214 [2024-07-24 18:51:26.983265] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217aab0 name Existed_Raid, state offline 00:14:42.214 18:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2105902 00:14:42.214 18:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2105902 ']' 00:14:42.214 18:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2105902 00:14:42.214 18:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2105902 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2105902' 00:14:42.214 killing process with pid 2105902 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2105902 00:14:42.214 [2024-07-24 18:51:27.035972] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:42.214 18:51:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 2105902 00:14:42.214 [2024-07-24 18:51:27.067942] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:42.473 00:14:42.473 real 0m24.210s 00:14:42.473 user 0m44.966s 00:14:42.473 sys 0m3.747s 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.473 ************************************ 00:14:42.473 END TEST raid_state_function_test 00:14:42.473 ************************************ 00:14:42.473 18:51:27 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:14:42.473 18:51:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:42.473 18:51:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:42.473 18:51:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:42.473 ************************************ 00:14:42.473 START TEST raid_state_function_test_sb 00:14:42.473 ************************************ 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 
-- # local base_bdevs 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2110943 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2110943' 00:14:42.473 Process raid pid: 2110943 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2110943 /var/tmp/spdk-raid.sock 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2110943 ']' 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:42.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:42.473 18:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.473 [2024-07-24 18:51:27.368257] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:14:42.473 [2024-07-24 18:51:27.368295] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:42.473 [2024-07-24 18:51:27.431472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:42.732 [2024-07-24 18:51:27.510097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:42.732 [2024-07-24 18:51:27.564387] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:42.732 [2024-07-24 18:51:27.564413] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:43.297 18:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:43.297 18:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:43.297 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:43.556 [2024-07-24 18:51:28.323069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:43.556 [2024-07-24 18:51:28.323097] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:43.556 [2024-07-24 18:51:28.323102] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:43.556 [2024-07-24 18:51:28.323107] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:43.556 [2024-07-24 18:51:28.323111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:43.556 [2024-07-24 18:51:28.323115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:43.556 [2024-07-24 18:51:28.323122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:43.556 [2024-07-24 18:51:28.323127] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.556 "name": "Existed_Raid", 00:14:43.556 "uuid": "e160d302-6f3b-44b0-8386-7b8547257d56", 00:14:43.556 "strip_size_kb": 64, 00:14:43.556 "state": "configuring", 00:14:43.556 "raid_level": "raid0", 00:14:43.556 "superblock": true, 00:14:43.556 "num_base_bdevs": 4, 00:14:43.556 "num_base_bdevs_discovered": 0, 00:14:43.556 "num_base_bdevs_operational": 4, 00:14:43.556 "base_bdevs_list": [ 00:14:43.556 { 00:14:43.556 "name": "BaseBdev1", 00:14:43.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.556 "is_configured": false, 00:14:43.556 "data_offset": 0, 00:14:43.556 "data_size": 0 00:14:43.556 }, 00:14:43.556 { 00:14:43.556 "name": "BaseBdev2", 00:14:43.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.556 "is_configured": false, 00:14:43.556 "data_offset": 0, 00:14:43.556 "data_size": 0 00:14:43.556 }, 00:14:43.556 { 00:14:43.556 "name": "BaseBdev3", 00:14:43.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.556 "is_configured": false, 00:14:43.556 "data_offset": 0, 00:14:43.556 "data_size": 0 00:14:43.556 }, 00:14:43.556 { 00:14:43.556 "name": "BaseBdev4", 00:14:43.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.556 "is_configured": false, 00:14:43.556 "data_offset": 0, 00:14:43.556 "data_size": 0 00:14:43.556 } 00:14:43.556 ] 00:14:43.556 }' 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.556 18:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.123 18:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:44.382 [2024-07-24 18:51:29.137079] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:44.382 [2024-07-24 18:51:29.137098] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1848bc0 name Existed_Raid, state configuring 00:14:44.382 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:44.382 [2024-07-24 18:51:29.309561] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:44.382 [2024-07-24 18:51:29.309579] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:44.382 [2024-07-24 18:51:29.309584] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:44.382 [2024-07-24 18:51:29.309588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:44.382 [2024-07-24 18:51:29.309593] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:44.382 [2024-07-24 18:51:29.309601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:44.382 [2024-07-24 18:51:29.309605] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:44.382 
[2024-07-24 18:51:29.309611] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:44.382 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:44.640 [2024-07-24 18:51:29.498254] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:44.640 BaseBdev1 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.640 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:44.899 [ 00:14:44.899 { 00:14:44.899 "name": "BaseBdev1", 00:14:44.899 "aliases": [ 00:14:44.899 "d3b9384b-e184-467c-bc86-2345e0c9b930" 00:14:44.899 ], 00:14:44.899 "product_name": "Malloc disk", 00:14:44.899 "block_size": 512, 00:14:44.899 "num_blocks": 65536, 00:14:44.899 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:44.899 "assigned_rate_limits": { 00:14:44.899 "rw_ios_per_sec": 0, 00:14:44.899 "rw_mbytes_per_sec": 0, 00:14:44.899 "r_mbytes_per_sec": 0, 00:14:44.899 "w_mbytes_per_sec": 0 00:14:44.899 }, 00:14:44.899 "claimed": true, 00:14:44.899 "claim_type": "exclusive_write", 00:14:44.899 "zoned": false, 00:14:44.899 "supported_io_types": { 00:14:44.899 "read": true, 00:14:44.899 "write": true, 00:14:44.899 "unmap": true, 00:14:44.899 "flush": true, 00:14:44.899 "reset": true, 00:14:44.899 "nvme_admin": false, 00:14:44.899 "nvme_io": false, 00:14:44.899 "nvme_io_md": false, 00:14:44.899 "write_zeroes": true, 00:14:44.899 "zcopy": true, 00:14:44.899 "get_zone_info": false, 00:14:44.899 "zone_management": false, 00:14:44.899 "zone_append": false, 00:14:44.899 "compare": false, 00:14:44.899 "compare_and_write": false, 00:14:44.899 "abort": true, 00:14:44.899 "seek_hole": false, 00:14:44.899 "seek_data": false, 00:14:44.899 "copy": true, 00:14:44.899 "nvme_iov_md": false 00:14:44.899 }, 00:14:44.899 "memory_domains": [ 00:14:44.899 { 00:14:44.899 "dma_device_id": "system", 00:14:44.899 "dma_device_type": 1 00:14:44.899 }, 00:14:44.899 { 00:14:44.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.899 "dma_device_type": 2 00:14:44.899 } 00:14:44.899 ], 00:14:44.899 "driver_specific": {} 00:14:44.899 } 00:14:44.899 ] 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:44.899 18:51:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.899 18:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.157 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.157 "name": "Existed_Raid", 00:14:45.157 "uuid": "af8f7a72-740f-428e-ab9f-ef8bcc5df5ad", 00:14:45.157 "strip_size_kb": 64, 00:14:45.157 "state": "configuring", 00:14:45.157 "raid_level": "raid0", 00:14:45.157 "superblock": true, 00:14:45.157 "num_base_bdevs": 4, 00:14:45.157 "num_base_bdevs_discovered": 1, 00:14:45.157 "num_base_bdevs_operational": 4, 00:14:45.157 "base_bdevs_list": [ 00:14:45.157 { 00:14:45.157 "name": "BaseBdev1", 00:14:45.157 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:45.157 "is_configured": true, 00:14:45.157 "data_offset": 2048, 00:14:45.157 "data_size": 63488 00:14:45.157 }, 00:14:45.157 { 00:14:45.157 "name": "BaseBdev2", 00:14:45.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.157 "is_configured": false, 00:14:45.157 "data_offset": 0, 00:14:45.157 "data_size": 0 00:14:45.157 }, 00:14:45.157 { 00:14:45.157 "name": "BaseBdev3", 00:14:45.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.157 "is_configured": false, 00:14:45.157 "data_offset": 0, 00:14:45.157 "data_size": 0 00:14:45.157 }, 00:14:45.157 { 00:14:45.157 "name": "BaseBdev4", 00:14:45.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.157 "is_configured": false, 00:14:45.157 "data_offset": 0, 00:14:45.157 "data_size": 0 00:14:45.157 } 00:14:45.157 ] 00:14:45.157 }' 00:14:45.157 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.157 18:51:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:45.725 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:45.725 [2024-07-24 18:51:30.653249] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:45.725 [2024-07-24 18:51:30.653282] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1848430 name Existed_Raid, state configuring 00:14:45.725 18:51:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:45.983 [2024-07-24 18:51:30.837753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:45.983 [2024-07-24 18:51:30.838815] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:45.984 [2024-07-24 18:51:30.838840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:45.984 [2024-07-24 18:51:30.838846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:45.984 [2024-07-24 18:51:30.838851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:45.984 [2024-07-24 18:51:30.838855] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:45.984 [2024-07-24 18:51:30.838860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.984 18:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.242 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.242 "name": "Existed_Raid", 00:14:46.242 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:46.242 "strip_size_kb": 64, 00:14:46.242 "state": "configuring", 00:14:46.242 "raid_level": "raid0", 00:14:46.242 "superblock": true, 00:14:46.242 "num_base_bdevs": 4, 00:14:46.242 "num_base_bdevs_discovered": 1, 00:14:46.242 "num_base_bdevs_operational": 4, 00:14:46.242 "base_bdevs_list": [ 00:14:46.242 { 00:14:46.242 "name": "BaseBdev1", 00:14:46.242 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:46.242 "is_configured": true, 00:14:46.242 "data_offset": 2048, 
00:14:46.242 "data_size": 63488 00:14:46.242 }, 00:14:46.242 { 00:14:46.242 "name": "BaseBdev2", 00:14:46.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.242 "is_configured": false, 00:14:46.242 "data_offset": 0, 00:14:46.242 "data_size": 0 00:14:46.242 }, 00:14:46.242 { 00:14:46.242 "name": "BaseBdev3", 00:14:46.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.242 "is_configured": false, 00:14:46.242 "data_offset": 0, 00:14:46.242 "data_size": 0 00:14:46.242 }, 00:14:46.242 { 00:14:46.242 "name": "BaseBdev4", 00:14:46.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.242 "is_configured": false, 00:14:46.242 "data_offset": 0, 00:14:46.242 "data_size": 0 00:14:46.242 } 00:14:46.242 ] 00:14:46.242 }' 00:14:46.242 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.242 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.501 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:46.759 [2024-07-24 18:51:31.642441] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:46.759 BaseBdev2 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:46.759 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:47.018 [ 00:14:47.018 { 00:14:47.018 "name": "BaseBdev2", 00:14:47.018 "aliases": [ 00:14:47.018 "a6a3f626-eac5-411e-a1e9-9197ef753423" 00:14:47.018 ], 00:14:47.018 "product_name": "Malloc disk", 00:14:47.018 "block_size": 512, 00:14:47.018 "num_blocks": 65536, 00:14:47.018 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:47.018 "assigned_rate_limits": { 00:14:47.018 "rw_ios_per_sec": 0, 00:14:47.018 "rw_mbytes_per_sec": 0, 00:14:47.018 "r_mbytes_per_sec": 0, 00:14:47.018 "w_mbytes_per_sec": 0 00:14:47.018 }, 00:14:47.018 "claimed": true, 00:14:47.018 "claim_type": "exclusive_write", 00:14:47.018 "zoned": false, 00:14:47.018 "supported_io_types": { 00:14:47.018 "read": true, 00:14:47.018 "write": true, 00:14:47.018 "unmap": true, 00:14:47.018 "flush": true, 00:14:47.018 "reset": true, 00:14:47.018 "nvme_admin": false, 00:14:47.018 "nvme_io": false, 00:14:47.018 "nvme_io_md": false, 00:14:47.018 "write_zeroes": true, 00:14:47.018 "zcopy": true, 00:14:47.018 "get_zone_info": false, 00:14:47.018 "zone_management": false, 00:14:47.018 "zone_append": false, 00:14:47.018 "compare": false, 
00:14:47.018 "compare_and_write": false, 00:14:47.018 "abort": true, 00:14:47.018 "seek_hole": false, 00:14:47.018 "seek_data": false, 00:14:47.018 "copy": true, 00:14:47.018 "nvme_iov_md": false 00:14:47.018 }, 00:14:47.018 "memory_domains": [ 00:14:47.018 { 00:14:47.018 "dma_device_id": "system", 00:14:47.018 "dma_device_type": 1 00:14:47.018 }, 00:14:47.018 { 00:14:47.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.018 "dma_device_type": 2 00:14:47.018 } 00:14:47.018 ], 00:14:47.018 "driver_specific": {} 00:14:47.018 } 00:14:47.018 ] 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.018 18:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.277 18:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.277 "name": "Existed_Raid", 00:14:47.277 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:47.277 "strip_size_kb": 64, 00:14:47.277 "state": "configuring", 00:14:47.277 "raid_level": "raid0", 00:14:47.277 "superblock": true, 00:14:47.277 "num_base_bdevs": 4, 00:14:47.277 "num_base_bdevs_discovered": 2, 00:14:47.277 "num_base_bdevs_operational": 4, 00:14:47.277 "base_bdevs_list": [ 00:14:47.277 { 00:14:47.277 "name": "BaseBdev1", 00:14:47.277 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:47.277 "is_configured": true, 00:14:47.277 "data_offset": 2048, 00:14:47.277 "data_size": 63488 00:14:47.277 }, 00:14:47.277 { 00:14:47.277 "name": "BaseBdev2", 00:14:47.277 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:47.277 "is_configured": true, 00:14:47.277 "data_offset": 2048, 00:14:47.277 "data_size": 63488 00:14:47.277 }, 00:14:47.277 { 00:14:47.277 "name": "BaseBdev3", 00:14:47.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.277 "is_configured": false, 00:14:47.277 "data_offset": 0, 00:14:47.277 
"data_size": 0 00:14:47.277 }, 00:14:47.277 { 00:14:47.277 "name": "BaseBdev4", 00:14:47.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.277 "is_configured": false, 00:14:47.277 "data_offset": 0, 00:14:47.277 "data_size": 0 00:14:47.277 } 00:14:47.277 ] 00:14:47.277 }' 00:14:47.277 18:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.277 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:47.843 [2024-07-24 18:51:32.788052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:47.843 BaseBdev3 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:47.843 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.102 18:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:48.360 [ 00:14:48.360 { 00:14:48.360 "name": "BaseBdev3", 00:14:48.360 "aliases": [ 00:14:48.360 "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b" 00:14:48.360 ], 00:14:48.360 "product_name": "Malloc disk", 00:14:48.360 "block_size": 512, 00:14:48.360 "num_blocks": 65536, 00:14:48.360 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:48.360 "assigned_rate_limits": { 00:14:48.360 "rw_ios_per_sec": 0, 00:14:48.360 "rw_mbytes_per_sec": 0, 00:14:48.360 "r_mbytes_per_sec": 0, 00:14:48.360 "w_mbytes_per_sec": 0 00:14:48.360 }, 00:14:48.360 "claimed": true, 00:14:48.360 "claim_type": "exclusive_write", 00:14:48.360 "zoned": false, 00:14:48.360 "supported_io_types": { 00:14:48.360 "read": true, 00:14:48.360 "write": true, 00:14:48.360 "unmap": true, 00:14:48.360 "flush": true, 00:14:48.360 "reset": true, 00:14:48.360 "nvme_admin": false, 00:14:48.360 "nvme_io": false, 00:14:48.360 "nvme_io_md": false, 00:14:48.360 "write_zeroes": true, 00:14:48.360 "zcopy": true, 00:14:48.360 "get_zone_info": false, 00:14:48.360 "zone_management": false, 00:14:48.360 "zone_append": false, 00:14:48.360 "compare": false, 00:14:48.360 "compare_and_write": false, 00:14:48.360 "abort": true, 00:14:48.360 "seek_hole": false, 00:14:48.360 "seek_data": false, 00:14:48.360 "copy": true, 00:14:48.360 "nvme_iov_md": false 00:14:48.360 }, 00:14:48.360 "memory_domains": [ 00:14:48.360 { 00:14:48.360 "dma_device_id": "system", 00:14:48.360 "dma_device_type": 1 00:14:48.360 }, 00:14:48.360 { 00:14:48.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.360 "dma_device_type": 2 
00:14:48.361 } 00:14:48.361 ], 00:14:48.361 "driver_specific": {} 00:14:48.361 } 00:14:48.361 ] 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.361 "name": "Existed_Raid", 00:14:48.361 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:48.361 "strip_size_kb": 64, 00:14:48.361 "state": "configuring", 00:14:48.361 "raid_level": "raid0", 00:14:48.361 "superblock": true, 00:14:48.361 "num_base_bdevs": 4, 00:14:48.361 "num_base_bdevs_discovered": 3, 00:14:48.361 "num_base_bdevs_operational": 4, 00:14:48.361 "base_bdevs_list": [ 00:14:48.361 { 00:14:48.361 "name": "BaseBdev1", 00:14:48.361 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:48.361 "is_configured": true, 00:14:48.361 "data_offset": 2048, 00:14:48.361 "data_size": 63488 00:14:48.361 }, 00:14:48.361 { 00:14:48.361 "name": "BaseBdev2", 00:14:48.361 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:48.361 "is_configured": true, 00:14:48.361 "data_offset": 2048, 00:14:48.361 "data_size": 63488 00:14:48.361 }, 00:14:48.361 { 00:14:48.361 "name": "BaseBdev3", 00:14:48.361 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:48.361 "is_configured": true, 00:14:48.361 "data_offset": 2048, 00:14:48.361 "data_size": 63488 00:14:48.361 }, 00:14:48.361 { 00:14:48.361 "name": "BaseBdev4", 00:14:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:48.361 "is_configured": false, 00:14:48.361 "data_offset": 0, 00:14:48.361 "data_size": 0 00:14:48.361 } 00:14:48.361 ] 00:14:48.361 }' 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.361 18:51:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:48.929 [2024-07-24 18:51:33.913650] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:48.929 [2024-07-24 18:51:33.913778] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1849490 00:14:48.929 [2024-07-24 18:51:33.913786] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:48.929 [2024-07-24 18:51:33.913897] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18352d0 00:14:48.929 [2024-07-24 18:51:33.913978] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1849490 00:14:48.929 [2024-07-24 18:51:33.913983] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1849490 00:14:48.929 [2024-07-24 18:51:33.914042] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:48.929 BaseBdev4 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.929 18:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:49.187 18:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:49.447 [ 00:14:49.447 { 00:14:49.447 "name": "BaseBdev4", 00:14:49.447 "aliases": [ 00:14:49.447 "73266e31-3792-4f66-812b-576a27088434" 00:14:49.447 ], 00:14:49.447 "product_name": "Malloc disk", 00:14:49.447 "block_size": 512, 00:14:49.447 "num_blocks": 65536, 00:14:49.447 "uuid": "73266e31-3792-4f66-812b-576a27088434", 00:14:49.447 "assigned_rate_limits": { 00:14:49.447 "rw_ios_per_sec": 0, 00:14:49.447 "rw_mbytes_per_sec": 0, 00:14:49.447 "r_mbytes_per_sec": 0, 00:14:49.447 "w_mbytes_per_sec": 0 00:14:49.447 }, 00:14:49.447 "claimed": true, 00:14:49.447 "claim_type": "exclusive_write", 00:14:49.447 "zoned": false, 00:14:49.447 "supported_io_types": { 00:14:49.447 "read": true, 00:14:49.447 "write": true, 00:14:49.447 "unmap": true, 00:14:49.447 "flush": true, 00:14:49.447 "reset": true, 00:14:49.447 "nvme_admin": false, 00:14:49.447 "nvme_io": false, 00:14:49.447 "nvme_io_md": false, 00:14:49.447 "write_zeroes": true, 00:14:49.447 "zcopy": true, 00:14:49.447 "get_zone_info": false, 00:14:49.447 "zone_management": false, 00:14:49.447 "zone_append": false, 00:14:49.447 "compare": false, 00:14:49.447 "compare_and_write": false, 00:14:49.447 "abort": true, 00:14:49.447 "seek_hole": false, 00:14:49.447 "seek_data": false, 00:14:49.447 "copy": 
true, 00:14:49.447 "nvme_iov_md": false 00:14:49.447 }, 00:14:49.447 "memory_domains": [ 00:14:49.447 { 00:14:49.447 "dma_device_id": "system", 00:14:49.447 "dma_device_type": 1 00:14:49.447 }, 00:14:49.447 { 00:14:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.447 "dma_device_type": 2 00:14:49.447 } 00:14:49.447 ], 00:14:49.447 "driver_specific": {} 00:14:49.447 } 00:14:49.447 ] 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.447 "name": "Existed_Raid", 00:14:49.447 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:49.447 "strip_size_kb": 64, 00:14:49.447 "state": "online", 00:14:49.447 "raid_level": "raid0", 00:14:49.447 "superblock": true, 00:14:49.447 "num_base_bdevs": 4, 00:14:49.447 "num_base_bdevs_discovered": 4, 00:14:49.447 "num_base_bdevs_operational": 4, 00:14:49.447 "base_bdevs_list": [ 00:14:49.447 { 00:14:49.447 "name": "BaseBdev1", 00:14:49.447 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:49.447 "is_configured": true, 00:14:49.447 "data_offset": 2048, 00:14:49.447 "data_size": 63488 00:14:49.447 }, 00:14:49.447 { 00:14:49.447 "name": "BaseBdev2", 00:14:49.447 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:49.447 "is_configured": true, 00:14:49.447 "data_offset": 2048, 00:14:49.447 "data_size": 63488 00:14:49.447 }, 00:14:49.447 { 00:14:49.447 "name": "BaseBdev3", 00:14:49.447 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:49.447 "is_configured": true, 00:14:49.447 "data_offset": 2048, 00:14:49.447 "data_size": 63488 00:14:49.447 }, 00:14:49.447 { 00:14:49.447 "name": "BaseBdev4", 00:14:49.447 "uuid": "73266e31-3792-4f66-812b-576a27088434", 00:14:49.447 
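# Sketch (annotation) of the add-until-online loop shown above: each missing base bdev is created as a 32 MiB malloc
# disk with 512-byte blocks, waitforbdev polls for it after bdev_wait_for_examine, and the Existed_Raid state is
# re-read; it stays "configuring" until the fourth member is claimed, at which point the array goes "online" with
# 253952 blocks (4 x 63488 data blocks, 2048 blocks per member reserved for the superblock). Roughly, per iteration:
#   rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
#   $rpc -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
#   $rpc -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
#   $rpc -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
#   $rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'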
"is_configured": true, 00:14:49.447 "data_offset": 2048, 00:14:49.447 "data_size": 63488 00:14:49.447 } 00:14:49.447 ] 00:14:49.447 }' 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.447 18:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:50.098 18:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:50.098 [2024-07-24 18:51:35.064840] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:50.098 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:50.098 "name": "Existed_Raid", 00:14:50.098 "aliases": [ 00:14:50.098 "238fe2d1-345b-47ac-991d-1473d12870db" 00:14:50.098 ], 00:14:50.098 "product_name": "Raid Volume", 00:14:50.098 "block_size": 512, 00:14:50.098 "num_blocks": 253952, 00:14:50.098 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:50.098 "assigned_rate_limits": { 00:14:50.098 "rw_ios_per_sec": 0, 00:14:50.098 "rw_mbytes_per_sec": 0, 00:14:50.098 "r_mbytes_per_sec": 0, 00:14:50.098 "w_mbytes_per_sec": 0 00:14:50.098 }, 00:14:50.098 "claimed": false, 00:14:50.098 "zoned": false, 00:14:50.098 "supported_io_types": { 00:14:50.098 "read": true, 00:14:50.098 "write": true, 00:14:50.098 "unmap": true, 00:14:50.098 "flush": true, 00:14:50.098 "reset": true, 00:14:50.098 "nvme_admin": false, 00:14:50.098 "nvme_io": false, 00:14:50.098 "nvme_io_md": false, 00:14:50.098 "write_zeroes": true, 00:14:50.098 "zcopy": false, 00:14:50.098 "get_zone_info": false, 00:14:50.098 "zone_management": false, 00:14:50.098 "zone_append": false, 00:14:50.098 "compare": false, 00:14:50.098 "compare_and_write": false, 00:14:50.098 "abort": false, 00:14:50.098 "seek_hole": false, 00:14:50.098 "seek_data": false, 00:14:50.098 "copy": false, 00:14:50.098 "nvme_iov_md": false 00:14:50.098 }, 00:14:50.098 "memory_domains": [ 00:14:50.098 { 00:14:50.098 "dma_device_id": "system", 00:14:50.098 "dma_device_type": 1 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.098 "dma_device_type": 2 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "system", 00:14:50.098 "dma_device_type": 1 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.098 "dma_device_type": 2 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "system", 00:14:50.098 "dma_device_type": 1 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.098 "dma_device_type": 2 00:14:50.098 }, 00:14:50.098 { 
00:14:50.098 "dma_device_id": "system", 00:14:50.098 "dma_device_type": 1 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.098 "dma_device_type": 2 00:14:50.098 } 00:14:50.098 ], 00:14:50.098 "driver_specific": { 00:14:50.098 "raid": { 00:14:50.098 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:50.098 "strip_size_kb": 64, 00:14:50.098 "state": "online", 00:14:50.098 "raid_level": "raid0", 00:14:50.098 "superblock": true, 00:14:50.098 "num_base_bdevs": 4, 00:14:50.098 "num_base_bdevs_discovered": 4, 00:14:50.098 "num_base_bdevs_operational": 4, 00:14:50.098 "base_bdevs_list": [ 00:14:50.098 { 00:14:50.098 "name": "BaseBdev1", 00:14:50.098 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:50.098 "is_configured": true, 00:14:50.098 "data_offset": 2048, 00:14:50.098 "data_size": 63488 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "name": "BaseBdev2", 00:14:50.098 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:50.098 "is_configured": true, 00:14:50.098 "data_offset": 2048, 00:14:50.098 "data_size": 63488 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "name": "BaseBdev3", 00:14:50.098 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:50.098 "is_configured": true, 00:14:50.098 "data_offset": 2048, 00:14:50.098 "data_size": 63488 00:14:50.098 }, 00:14:50.098 { 00:14:50.098 "name": "BaseBdev4", 00:14:50.098 "uuid": "73266e31-3792-4f66-812b-576a27088434", 00:14:50.098 "is_configured": true, 00:14:50.098 "data_offset": 2048, 00:14:50.098 "data_size": 63488 00:14:50.098 } 00:14:50.098 ] 00:14:50.098 } 00:14:50.098 } 00:14:50.098 }' 00:14:50.098 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:50.357 BaseBdev2 00:14:50.357 BaseBdev3 00:14:50.357 BaseBdev4' 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.357 "name": "BaseBdev1", 00:14:50.357 "aliases": [ 00:14:50.357 "d3b9384b-e184-467c-bc86-2345e0c9b930" 00:14:50.357 ], 00:14:50.357 "product_name": "Malloc disk", 00:14:50.357 "block_size": 512, 00:14:50.357 "num_blocks": 65536, 00:14:50.357 "uuid": "d3b9384b-e184-467c-bc86-2345e0c9b930", 00:14:50.357 "assigned_rate_limits": { 00:14:50.357 "rw_ios_per_sec": 0, 00:14:50.357 "rw_mbytes_per_sec": 0, 00:14:50.357 "r_mbytes_per_sec": 0, 00:14:50.357 "w_mbytes_per_sec": 0 00:14:50.357 }, 00:14:50.357 "claimed": true, 00:14:50.357 "claim_type": "exclusive_write", 00:14:50.357 "zoned": false, 00:14:50.357 "supported_io_types": { 00:14:50.357 "read": true, 00:14:50.357 "write": true, 00:14:50.357 "unmap": true, 00:14:50.357 "flush": true, 00:14:50.357 "reset": true, 00:14:50.357 "nvme_admin": false, 00:14:50.357 "nvme_io": false, 00:14:50.357 "nvme_io_md": false, 00:14:50.357 "write_zeroes": true, 00:14:50.357 "zcopy": true, 00:14:50.357 "get_zone_info": false, 00:14:50.357 "zone_management": false, 00:14:50.357 "zone_append": 
false, 00:14:50.357 "compare": false, 00:14:50.357 "compare_and_write": false, 00:14:50.357 "abort": true, 00:14:50.357 "seek_hole": false, 00:14:50.357 "seek_data": false, 00:14:50.357 "copy": true, 00:14:50.357 "nvme_iov_md": false 00:14:50.357 }, 00:14:50.357 "memory_domains": [ 00:14:50.357 { 00:14:50.357 "dma_device_id": "system", 00:14:50.357 "dma_device_type": 1 00:14:50.357 }, 00:14:50.357 { 00:14:50.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.357 "dma_device_type": 2 00:14:50.357 } 00:14:50.357 ], 00:14:50.357 "driver_specific": {} 00:14:50.357 }' 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.357 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:50.615 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.874 "name": "BaseBdev2", 00:14:50.874 "aliases": [ 00:14:50.874 "a6a3f626-eac5-411e-a1e9-9197ef753423" 00:14:50.874 ], 00:14:50.874 "product_name": "Malloc disk", 00:14:50.874 "block_size": 512, 00:14:50.874 "num_blocks": 65536, 00:14:50.874 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:50.874 "assigned_rate_limits": { 00:14:50.874 "rw_ios_per_sec": 0, 00:14:50.874 "rw_mbytes_per_sec": 0, 00:14:50.874 "r_mbytes_per_sec": 0, 00:14:50.874 "w_mbytes_per_sec": 0 00:14:50.874 }, 00:14:50.874 "claimed": true, 00:14:50.874 "claim_type": "exclusive_write", 00:14:50.874 "zoned": false, 00:14:50.874 "supported_io_types": { 00:14:50.874 "read": true, 00:14:50.874 "write": true, 00:14:50.874 "unmap": true, 00:14:50.874 "flush": true, 00:14:50.874 "reset": true, 00:14:50.874 "nvme_admin": false, 00:14:50.874 "nvme_io": false, 00:14:50.874 "nvme_io_md": false, 00:14:50.874 "write_zeroes": true, 00:14:50.874 "zcopy": true, 00:14:50.874 "get_zone_info": false, 00:14:50.874 "zone_management": false, 00:14:50.874 "zone_append": false, 00:14:50.874 "compare": false, 00:14:50.874 "compare_and_write": false, 00:14:50.874 "abort": true, 00:14:50.874 "seek_hole": 
false, 00:14:50.874 "seek_data": false, 00:14:50.874 "copy": true, 00:14:50.874 "nvme_iov_md": false 00:14:50.874 }, 00:14:50.874 "memory_domains": [ 00:14:50.874 { 00:14:50.874 "dma_device_id": "system", 00:14:50.874 "dma_device_type": 1 00:14:50.874 }, 00:14:50.874 { 00:14:50.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.874 "dma_device_type": 2 00:14:50.874 } 00:14:50.874 ], 00:14:50.874 "driver_specific": {} 00:14:50.874 }' 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.874 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.133 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.133 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.133 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.133 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.133 18:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.133 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.133 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.133 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.133 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.133 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.391 "name": "BaseBdev3", 00:14:51.391 "aliases": [ 00:14:51.391 "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b" 00:14:51.391 ], 00:14:51.391 "product_name": "Malloc disk", 00:14:51.391 "block_size": 512, 00:14:51.391 "num_blocks": 65536, 00:14:51.391 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:51.391 "assigned_rate_limits": { 00:14:51.391 "rw_ios_per_sec": 0, 00:14:51.391 "rw_mbytes_per_sec": 0, 00:14:51.391 "r_mbytes_per_sec": 0, 00:14:51.391 "w_mbytes_per_sec": 0 00:14:51.391 }, 00:14:51.391 "claimed": true, 00:14:51.391 "claim_type": "exclusive_write", 00:14:51.391 "zoned": false, 00:14:51.391 "supported_io_types": { 00:14:51.391 "read": true, 00:14:51.391 "write": true, 00:14:51.391 "unmap": true, 00:14:51.391 "flush": true, 00:14:51.391 "reset": true, 00:14:51.391 "nvme_admin": false, 00:14:51.391 "nvme_io": false, 00:14:51.391 "nvme_io_md": false, 00:14:51.391 "write_zeroes": true, 00:14:51.391 "zcopy": true, 00:14:51.391 "get_zone_info": false, 00:14:51.391 "zone_management": false, 00:14:51.391 "zone_append": false, 00:14:51.391 "compare": false, 00:14:51.391 "compare_and_write": false, 00:14:51.391 "abort": true, 00:14:51.391 "seek_hole": false, 00:14:51.391 "seek_data": false, 00:14:51.391 "copy": true, 00:14:51.391 "nvme_iov_md": false 00:14:51.391 }, 00:14:51.391 
"memory_domains": [ 00:14:51.391 { 00:14:51.391 "dma_device_id": "system", 00:14:51.391 "dma_device_type": 1 00:14:51.391 }, 00:14:51.391 { 00:14:51.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.391 "dma_device_type": 2 00:14:51.391 } 00:14:51.391 ], 00:14:51.391 "driver_specific": {} 00:14:51.391 }' 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.391 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:51.649 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.907 "name": "BaseBdev4", 00:14:51.907 "aliases": [ 00:14:51.907 "73266e31-3792-4f66-812b-576a27088434" 00:14:51.907 ], 00:14:51.907 "product_name": "Malloc disk", 00:14:51.907 "block_size": 512, 00:14:51.907 "num_blocks": 65536, 00:14:51.907 "uuid": "73266e31-3792-4f66-812b-576a27088434", 00:14:51.907 "assigned_rate_limits": { 00:14:51.907 "rw_ios_per_sec": 0, 00:14:51.907 "rw_mbytes_per_sec": 0, 00:14:51.907 "r_mbytes_per_sec": 0, 00:14:51.907 "w_mbytes_per_sec": 0 00:14:51.907 }, 00:14:51.907 "claimed": true, 00:14:51.907 "claim_type": "exclusive_write", 00:14:51.907 "zoned": false, 00:14:51.907 "supported_io_types": { 00:14:51.907 "read": true, 00:14:51.907 "write": true, 00:14:51.907 "unmap": true, 00:14:51.907 "flush": true, 00:14:51.907 "reset": true, 00:14:51.907 "nvme_admin": false, 00:14:51.907 "nvme_io": false, 00:14:51.907 "nvme_io_md": false, 00:14:51.907 "write_zeroes": true, 00:14:51.907 "zcopy": true, 00:14:51.907 "get_zone_info": false, 00:14:51.907 "zone_management": false, 00:14:51.907 "zone_append": false, 00:14:51.907 "compare": false, 00:14:51.907 "compare_and_write": false, 00:14:51.907 "abort": true, 00:14:51.907 "seek_hole": false, 00:14:51.907 "seek_data": false, 00:14:51.907 "copy": true, 00:14:51.907 "nvme_iov_md": false 00:14:51.907 }, 00:14:51.907 "memory_domains": [ 00:14:51.907 { 00:14:51.907 "dma_device_id": "system", 00:14:51.907 "dma_device_type": 1 00:14:51.907 }, 
00:14:51.907 { 00:14:51.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.907 "dma_device_type": 2 00:14:51.907 } 00:14:51.907 ], 00:14:51.907 "driver_specific": {} 00:14:51.907 }' 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.907 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.166 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.166 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.166 18:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:52.166 [2024-07-24 18:51:37.130018] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:52.166 [2024-07-24 18:51:37.130036] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:52.166 [2024-07-24 18:51:37.130069] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.166 18:51:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.166 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.424 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.424 "name": "Existed_Raid", 00:14:52.424 "uuid": "238fe2d1-345b-47ac-991d-1473d12870db", 00:14:52.424 "strip_size_kb": 64, 00:14:52.424 "state": "offline", 00:14:52.425 "raid_level": "raid0", 00:14:52.425 "superblock": true, 00:14:52.425 "num_base_bdevs": 4, 00:14:52.425 "num_base_bdevs_discovered": 3, 00:14:52.425 "num_base_bdevs_operational": 3, 00:14:52.425 "base_bdevs_list": [ 00:14:52.425 { 00:14:52.425 "name": null, 00:14:52.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.425 "is_configured": false, 00:14:52.425 "data_offset": 2048, 00:14:52.425 "data_size": 63488 00:14:52.425 }, 00:14:52.425 { 00:14:52.425 "name": "BaseBdev2", 00:14:52.425 "uuid": "a6a3f626-eac5-411e-a1e9-9197ef753423", 00:14:52.425 "is_configured": true, 00:14:52.425 "data_offset": 2048, 00:14:52.425 "data_size": 63488 00:14:52.425 }, 00:14:52.425 { 00:14:52.425 "name": "BaseBdev3", 00:14:52.425 "uuid": "cf6bccd2-100e-41b7-9f8b-43eaf5b2572b", 00:14:52.425 "is_configured": true, 00:14:52.425 "data_offset": 2048, 00:14:52.425 "data_size": 63488 00:14:52.425 }, 00:14:52.425 { 00:14:52.425 "name": "BaseBdev4", 00:14:52.425 "uuid": "73266e31-3792-4f66-812b-576a27088434", 00:14:52.425 "is_configured": true, 00:14:52.425 "data_offset": 2048, 00:14:52.425 "data_size": 63488 00:14:52.425 } 00:14:52.425 ] 00:14:52.425 }' 00:14:52.425 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.425 18:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:52.990 18:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:53.248 [2024-07-24 18:51:38.049321] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:53.248 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:53.248 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:53.248 18:51:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:53.248 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.248 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:53.249 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:53.249 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:53.507 [2024-07-24 18:51:38.399839] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:53.507 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:53.507 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:53.507 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.507 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:53.766 [2024-07-24 18:51:38.746415] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:53.766 [2024-07-24 18:51:38.746446] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1849490 name Existed_Raid, state offline 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.766 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:54.024 18:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:54.282 BaseBdev2 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:54.282 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:54.540 [ 00:14:54.540 { 00:14:54.540 "name": "BaseBdev2", 00:14:54.540 "aliases": [ 00:14:54.540 "ccc78205-c13f-4738-bbc8-63ccd9fbf50d" 00:14:54.540 ], 00:14:54.540 "product_name": "Malloc disk", 00:14:54.540 "block_size": 512, 00:14:54.540 "num_blocks": 65536, 00:14:54.540 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:54.540 "assigned_rate_limits": { 00:14:54.540 "rw_ios_per_sec": 0, 00:14:54.540 "rw_mbytes_per_sec": 0, 00:14:54.540 "r_mbytes_per_sec": 0, 00:14:54.540 "w_mbytes_per_sec": 0 00:14:54.540 }, 00:14:54.540 "claimed": false, 00:14:54.540 "zoned": false, 00:14:54.540 "supported_io_types": { 00:14:54.540 "read": true, 00:14:54.540 "write": true, 00:14:54.540 "unmap": true, 00:14:54.540 "flush": true, 00:14:54.540 "reset": true, 00:14:54.540 "nvme_admin": false, 00:14:54.540 "nvme_io": false, 00:14:54.540 "nvme_io_md": false, 00:14:54.540 "write_zeroes": true, 00:14:54.540 "zcopy": true, 00:14:54.540 "get_zone_info": false, 00:14:54.540 "zone_management": false, 00:14:54.540 "zone_append": false, 00:14:54.540 "compare": false, 00:14:54.540 "compare_and_write": false, 00:14:54.540 "abort": true, 00:14:54.540 "seek_hole": false, 00:14:54.540 "seek_data": false, 00:14:54.540 "copy": true, 00:14:54.540 "nvme_iov_md": false 00:14:54.540 }, 00:14:54.540 "memory_domains": [ 00:14:54.540 { 00:14:54.540 "dma_device_id": "system", 00:14:54.540 "dma_device_type": 1 00:14:54.540 }, 00:14:54.540 { 00:14:54.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.540 "dma_device_type": 2 00:14:54.540 } 00:14:54.540 ], 00:14:54.540 "driver_specific": {} 00:14:54.540 } 00:14:54.540 ] 00:14:54.540 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:54.540 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:54.540 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:54.540 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:54.798 BaseBdev3 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:54.798 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:55.057 [ 00:14:55.057 { 00:14:55.057 "name": "BaseBdev3", 00:14:55.057 "aliases": [ 00:14:55.057 "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03" 00:14:55.057 ], 00:14:55.057 "product_name": "Malloc disk", 00:14:55.057 "block_size": 512, 00:14:55.057 "num_blocks": 65536, 00:14:55.057 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:55.057 "assigned_rate_limits": { 00:14:55.057 "rw_ios_per_sec": 0, 00:14:55.057 "rw_mbytes_per_sec": 0, 00:14:55.057 "r_mbytes_per_sec": 0, 00:14:55.057 "w_mbytes_per_sec": 0 00:14:55.057 }, 00:14:55.057 "claimed": false, 00:14:55.057 "zoned": false, 00:14:55.057 "supported_io_types": { 00:14:55.057 "read": true, 00:14:55.057 "write": true, 00:14:55.057 "unmap": true, 00:14:55.057 "flush": true, 00:14:55.057 "reset": true, 00:14:55.057 "nvme_admin": false, 00:14:55.057 "nvme_io": false, 00:14:55.057 "nvme_io_md": false, 00:14:55.057 "write_zeroes": true, 00:14:55.057 "zcopy": true, 00:14:55.057 "get_zone_info": false, 00:14:55.057 "zone_management": false, 00:14:55.057 "zone_append": false, 00:14:55.057 "compare": false, 00:14:55.057 "compare_and_write": false, 00:14:55.057 "abort": true, 00:14:55.057 "seek_hole": false, 00:14:55.057 "seek_data": false, 00:14:55.057 "copy": true, 00:14:55.057 "nvme_iov_md": false 00:14:55.057 }, 00:14:55.057 "memory_domains": [ 00:14:55.057 { 00:14:55.057 "dma_device_id": "system", 00:14:55.057 "dma_device_type": 1 00:14:55.057 }, 00:14:55.057 { 00:14:55.057 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.057 "dma_device_type": 2 00:14:55.057 } 00:14:55.057 ], 00:14:55.057 "driver_specific": {} 00:14:55.057 } 00:14:55.057 ] 00:14:55.057 18:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:55.057 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:55.057 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:55.057 18:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:55.315 BaseBdev4 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:55.315 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:55.574 [ 00:14:55.574 { 00:14:55.574 "name": "BaseBdev4", 00:14:55.574 "aliases": [ 00:14:55.574 "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b" 00:14:55.574 ], 00:14:55.574 "product_name": "Malloc disk", 00:14:55.574 "block_size": 512, 00:14:55.574 "num_blocks": 65536, 00:14:55.574 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:55.574 "assigned_rate_limits": { 00:14:55.574 "rw_ios_per_sec": 0, 00:14:55.574 "rw_mbytes_per_sec": 0, 00:14:55.574 "r_mbytes_per_sec": 0, 00:14:55.574 "w_mbytes_per_sec": 0 00:14:55.574 }, 00:14:55.574 "claimed": false, 00:14:55.574 "zoned": false, 00:14:55.574 "supported_io_types": { 00:14:55.574 "read": true, 00:14:55.574 "write": true, 00:14:55.574 "unmap": true, 00:14:55.574 "flush": true, 00:14:55.574 "reset": true, 00:14:55.574 "nvme_admin": false, 00:14:55.574 "nvme_io": false, 00:14:55.574 "nvme_io_md": false, 00:14:55.574 "write_zeroes": true, 00:14:55.574 "zcopy": true, 00:14:55.574 "get_zone_info": false, 00:14:55.574 "zone_management": false, 00:14:55.574 "zone_append": false, 00:14:55.574 "compare": false, 00:14:55.574 "compare_and_write": false, 00:14:55.574 "abort": true, 00:14:55.574 "seek_hole": false, 00:14:55.574 "seek_data": false, 00:14:55.574 "copy": true, 00:14:55.574 "nvme_iov_md": false 00:14:55.574 }, 00:14:55.574 "memory_domains": [ 00:14:55.574 { 00:14:55.574 "dma_device_id": "system", 00:14:55.574 "dma_device_type": 1 00:14:55.574 }, 00:14:55.574 { 00:14:55.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.574 "dma_device_type": 2 00:14:55.574 } 00:14:55.574 ], 00:14:55.574 "driver_specific": {} 00:14:55.574 } 00:14:55.574 ] 00:14:55.574 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:55.574 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:55.574 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:55.574 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:55.833 [2024-07-24 18:51:40.596269] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:55.833 [2024-07-24 18:51:40.596298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:55.833 [2024-07-24 18:51:40.596309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:55.833 [2024-07-24 18:51:40.597261] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:55.833 [2024-07-24 18:51:40.597290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.833 "name": "Existed_Raid", 00:14:55.833 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:14:55.833 "strip_size_kb": 64, 00:14:55.833 "state": "configuring", 00:14:55.833 "raid_level": "raid0", 00:14:55.833 "superblock": true, 00:14:55.833 "num_base_bdevs": 4, 00:14:55.833 "num_base_bdevs_discovered": 3, 00:14:55.833 "num_base_bdevs_operational": 4, 00:14:55.833 "base_bdevs_list": [ 00:14:55.833 { 00:14:55.833 "name": "BaseBdev1", 00:14:55.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.833 "is_configured": false, 00:14:55.833 "data_offset": 0, 00:14:55.833 "data_size": 0 00:14:55.833 }, 00:14:55.833 { 00:14:55.833 "name": "BaseBdev2", 00:14:55.833 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:55.833 "is_configured": true, 00:14:55.833 "data_offset": 2048, 00:14:55.833 "data_size": 63488 00:14:55.833 }, 00:14:55.833 { 00:14:55.833 "name": "BaseBdev3", 00:14:55.833 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:55.833 "is_configured": true, 00:14:55.833 "data_offset": 2048, 00:14:55.833 "data_size": 63488 00:14:55.833 }, 00:14:55.833 { 00:14:55.833 "name": "BaseBdev4", 00:14:55.833 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:55.833 "is_configured": true, 00:14:55.833 "data_offset": 2048, 00:14:55.833 "data_size": 63488 00:14:55.833 } 00:14:55.833 ] 00:14:55.833 }' 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.833 18:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:56.400 [2024-07-24 18:51:41.366245] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.400 18:51:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.400 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.659 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.659 "name": "Existed_Raid", 00:14:56.659 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:14:56.659 "strip_size_kb": 64, 00:14:56.659 "state": "configuring", 00:14:56.659 "raid_level": "raid0", 00:14:56.659 "superblock": true, 00:14:56.659 "num_base_bdevs": 4, 00:14:56.659 "num_base_bdevs_discovered": 2, 00:14:56.659 "num_base_bdevs_operational": 4, 00:14:56.659 "base_bdevs_list": [ 00:14:56.659 { 00:14:56.659 "name": "BaseBdev1", 00:14:56.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.659 "is_configured": false, 00:14:56.659 "data_offset": 0, 00:14:56.659 "data_size": 0 00:14:56.659 }, 00:14:56.659 { 00:14:56.659 "name": null, 00:14:56.659 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:56.659 "is_configured": false, 00:14:56.659 "data_offset": 2048, 00:14:56.659 "data_size": 63488 00:14:56.659 }, 00:14:56.659 { 00:14:56.659 "name": "BaseBdev3", 00:14:56.659 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:56.659 "is_configured": true, 00:14:56.659 "data_offset": 2048, 00:14:56.659 "data_size": 63488 00:14:56.659 }, 00:14:56.659 { 00:14:56.659 "name": "BaseBdev4", 00:14:56.659 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:56.659 "is_configured": true, 00:14:56.659 "data_offset": 2048, 00:14:56.659 "data_size": 63488 00:14:56.659 } 00:14:56.659 ] 00:14:56.659 }' 00:14:56.659 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.659 18:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:57.226 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.226 18:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:57.226 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:57.226 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
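At this point the test has pulled BaseBdev2 out of Existed_Raid and confirmed the array stays in the "configuring" state with only two of its four base bdevs discovered; it is about to re-create the missing BaseBdev1. A minimal sketch of that state check done by hand against the test's RPC socket, using only calls that appear in the trace above (the shell variable names are illustrative):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Fetch the raid bdev descriptor and confirm it is still assembling, not online.
info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[ "$(echo "$info" | jq -r .state)" = configuring ] || echo "unexpected state"
[ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" -eq 2 ] || echo "unexpected base bdev count"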
00:14:57.484 [2024-07-24 18:51:42.291334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:57.484 BaseBdev1 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:57.484 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:57.742 [ 00:14:57.742 { 00:14:57.742 "name": "BaseBdev1", 00:14:57.742 "aliases": [ 00:14:57.742 "1d779fd2-cecd-44ef-94ca-9894e63c3d26" 00:14:57.742 ], 00:14:57.742 "product_name": "Malloc disk", 00:14:57.742 "block_size": 512, 00:14:57.742 "num_blocks": 65536, 00:14:57.742 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:14:57.742 "assigned_rate_limits": { 00:14:57.742 "rw_ios_per_sec": 0, 00:14:57.742 "rw_mbytes_per_sec": 0, 00:14:57.743 "r_mbytes_per_sec": 0, 00:14:57.743 "w_mbytes_per_sec": 0 00:14:57.743 }, 00:14:57.743 "claimed": true, 00:14:57.743 "claim_type": "exclusive_write", 00:14:57.743 "zoned": false, 00:14:57.743 "supported_io_types": { 00:14:57.743 "read": true, 00:14:57.743 "write": true, 00:14:57.743 "unmap": true, 00:14:57.743 "flush": true, 00:14:57.743 "reset": true, 00:14:57.743 "nvme_admin": false, 00:14:57.743 "nvme_io": false, 00:14:57.743 "nvme_io_md": false, 00:14:57.743 "write_zeroes": true, 00:14:57.743 "zcopy": true, 00:14:57.743 "get_zone_info": false, 00:14:57.743 "zone_management": false, 00:14:57.743 "zone_append": false, 00:14:57.743 "compare": false, 00:14:57.743 "compare_and_write": false, 00:14:57.743 "abort": true, 00:14:57.743 "seek_hole": false, 00:14:57.743 "seek_data": false, 00:14:57.743 "copy": true, 00:14:57.743 "nvme_iov_md": false 00:14:57.743 }, 00:14:57.743 "memory_domains": [ 00:14:57.743 { 00:14:57.743 "dma_device_id": "system", 00:14:57.743 "dma_device_type": 1 00:14:57.743 }, 00:14:57.743 { 00:14:57.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.743 "dma_device_type": 2 00:14:57.743 } 00:14:57.743 ], 00:14:57.743 "driver_specific": {} 00:14:57.743 } 00:14:57.743 ] 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
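The BaseBdev1 recovery step above amounts to three RPCs: create the malloc bdev, flush pending examine callbacks, and poll for the bdev with a 2000 ms timeout before the raid state is re-verified. A rough, illustrative condensation of that sequence (the real waitforbdev helper in common/autotest_common.sh may do more than this):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Recreate the missing base bdev: 32 MiB backed by malloc, 512-byte blocks.
$rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
# Let examine callbacks run, then wait up to 2 s for the bdev to become queryable.
$rpc -s $sock bdev_wait_for_examine
$rpc -s $sock bdev_get_bdevs -b BaseBdev1 -t 2000 > /dev/null && echo "BaseBdev1 ready"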
00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.743 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.000 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.000 "name": "Existed_Raid", 00:14:58.000 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:14:58.000 "strip_size_kb": 64, 00:14:58.000 "state": "configuring", 00:14:58.000 "raid_level": "raid0", 00:14:58.000 "superblock": true, 00:14:58.000 "num_base_bdevs": 4, 00:14:58.000 "num_base_bdevs_discovered": 3, 00:14:58.000 "num_base_bdevs_operational": 4, 00:14:58.000 "base_bdevs_list": [ 00:14:58.000 { 00:14:58.000 "name": "BaseBdev1", 00:14:58.000 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:14:58.000 "is_configured": true, 00:14:58.000 "data_offset": 2048, 00:14:58.000 "data_size": 63488 00:14:58.000 }, 00:14:58.000 { 00:14:58.000 "name": null, 00:14:58.000 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:58.000 "is_configured": false, 00:14:58.000 "data_offset": 2048, 00:14:58.000 "data_size": 63488 00:14:58.000 }, 00:14:58.000 { 00:14:58.000 "name": "BaseBdev3", 00:14:58.000 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:58.000 "is_configured": true, 00:14:58.000 "data_offset": 2048, 00:14:58.000 "data_size": 63488 00:14:58.000 }, 00:14:58.000 { 00:14:58.000 "name": "BaseBdev4", 00:14:58.000 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:58.000 "is_configured": true, 00:14:58.000 "data_offset": 2048, 00:14:58.000 "data_size": 63488 00:14:58.000 } 00:14:58.000 ] 00:14:58.000 }' 00:14:58.000 18:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.000 18:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:58.258 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.258 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:58.516 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:58.516 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:58.775 [2024-07-24 18:51:43.538580] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.775 "name": "Existed_Raid", 00:14:58.775 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:14:58.775 "strip_size_kb": 64, 00:14:58.775 "state": "configuring", 00:14:58.775 "raid_level": "raid0", 00:14:58.775 "superblock": true, 00:14:58.775 "num_base_bdevs": 4, 00:14:58.775 "num_base_bdevs_discovered": 2, 00:14:58.775 "num_base_bdevs_operational": 4, 00:14:58.775 "base_bdevs_list": [ 00:14:58.775 { 00:14:58.775 "name": "BaseBdev1", 00:14:58.775 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:14:58.775 "is_configured": true, 00:14:58.775 "data_offset": 2048, 00:14:58.775 "data_size": 63488 00:14:58.775 }, 00:14:58.775 { 00:14:58.775 "name": null, 00:14:58.775 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:58.775 "is_configured": false, 00:14:58.775 "data_offset": 2048, 00:14:58.775 "data_size": 63488 00:14:58.775 }, 00:14:58.775 { 00:14:58.775 "name": null, 00:14:58.775 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:58.775 "is_configured": false, 00:14:58.775 "data_offset": 2048, 00:14:58.775 "data_size": 63488 00:14:58.775 }, 00:14:58.775 { 00:14:58.775 "name": "BaseBdev4", 00:14:58.775 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:58.775 "is_configured": true, 00:14:58.775 "data_offset": 2048, 00:14:58.775 "data_size": 63488 00:14:58.775 } 00:14:58.775 ] 00:14:58.775 }' 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.775 18:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:59.341 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:59.341 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:59.599 
18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:59.599 [2024-07-24 18:51:44.557210] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.599 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.857 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.857 "name": "Existed_Raid", 00:14:59.857 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:14:59.857 "strip_size_kb": 64, 00:14:59.857 "state": "configuring", 00:14:59.857 "raid_level": "raid0", 00:14:59.857 "superblock": true, 00:14:59.857 "num_base_bdevs": 4, 00:14:59.857 "num_base_bdevs_discovered": 3, 00:14:59.857 "num_base_bdevs_operational": 4, 00:14:59.857 "base_bdevs_list": [ 00:14:59.857 { 00:14:59.857 "name": "BaseBdev1", 00:14:59.857 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:14:59.857 "is_configured": true, 00:14:59.857 "data_offset": 2048, 00:14:59.857 "data_size": 63488 00:14:59.857 }, 00:14:59.857 { 00:14:59.857 "name": null, 00:14:59.857 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:14:59.857 "is_configured": false, 00:14:59.857 "data_offset": 2048, 00:14:59.857 "data_size": 63488 00:14:59.857 }, 00:14:59.857 { 00:14:59.857 "name": "BaseBdev3", 00:14:59.857 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:14:59.857 "is_configured": true, 00:14:59.857 "data_offset": 2048, 00:14:59.857 "data_size": 63488 00:14:59.857 }, 00:14:59.857 { 00:14:59.857 "name": "BaseBdev4", 00:14:59.857 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:14:59.857 "is_configured": true, 00:14:59.857 "data_offset": 2048, 00:14:59.857 "data_size": 63488 00:14:59.857 } 00:14:59.857 ] 00:14:59.857 }' 00:14:59.857 18:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.857 18:51:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.424 18:51:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.424 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:00.424 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:00.424 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:00.683 [2024-07-24 18:51:45.579889] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.683 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.941 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.941 "name": "Existed_Raid", 00:15:00.941 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:15:00.941 "strip_size_kb": 64, 00:15:00.941 "state": "configuring", 00:15:00.941 "raid_level": "raid0", 00:15:00.941 "superblock": true, 00:15:00.941 "num_base_bdevs": 4, 00:15:00.941 "num_base_bdevs_discovered": 2, 00:15:00.941 "num_base_bdevs_operational": 4, 00:15:00.941 "base_bdevs_list": [ 00:15:00.941 { 00:15:00.941 "name": null, 00:15:00.941 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:00.941 "is_configured": false, 00:15:00.941 "data_offset": 2048, 00:15:00.941 "data_size": 63488 00:15:00.941 }, 00:15:00.941 { 00:15:00.941 "name": null, 00:15:00.941 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:15:00.941 "is_configured": false, 00:15:00.941 "data_offset": 2048, 00:15:00.941 "data_size": 63488 00:15:00.941 }, 00:15:00.941 { 00:15:00.941 "name": "BaseBdev3", 00:15:00.941 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:15:00.941 "is_configured": true, 00:15:00.941 "data_offset": 2048, 00:15:00.941 "data_size": 63488 00:15:00.941 }, 00:15:00.941 { 00:15:00.941 "name": "BaseBdev4", 00:15:00.941 "uuid": 
"1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:15:00.941 "is_configured": true, 00:15:00.941 "data_offset": 2048, 00:15:00.941 "data_size": 63488 00:15:00.941 } 00:15:00.941 ] 00:15:00.941 }' 00:15:00.941 18:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.941 18:51:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:01.508 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:01.508 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.508 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:01.508 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:01.767 [2024-07-24 18:51:46.580228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.767 "name": "Existed_Raid", 00:15:01.767 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:15:01.767 "strip_size_kb": 64, 00:15:01.767 "state": "configuring", 00:15:01.767 "raid_level": "raid0", 00:15:01.767 "superblock": true, 00:15:01.767 "num_base_bdevs": 4, 00:15:01.767 "num_base_bdevs_discovered": 3, 00:15:01.767 "num_base_bdevs_operational": 4, 00:15:01.767 "base_bdevs_list": [ 00:15:01.767 { 00:15:01.767 "name": null, 00:15:01.767 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:01.767 "is_configured": false, 00:15:01.767 "data_offset": 2048, 00:15:01.767 "data_size": 63488 00:15:01.767 }, 00:15:01.767 { 00:15:01.767 "name": "BaseBdev2", 00:15:01.767 "uuid": 
"ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:15:01.767 "is_configured": true, 00:15:01.767 "data_offset": 2048, 00:15:01.767 "data_size": 63488 00:15:01.767 }, 00:15:01.767 { 00:15:01.767 "name": "BaseBdev3", 00:15:01.767 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:15:01.767 "is_configured": true, 00:15:01.767 "data_offset": 2048, 00:15:01.767 "data_size": 63488 00:15:01.767 }, 00:15:01.767 { 00:15:01.767 "name": "BaseBdev4", 00:15:01.767 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:15:01.767 "is_configured": true, 00:15:01.767 "data_offset": 2048, 00:15:01.767 "data_size": 63488 00:15:01.767 } 00:15:01.767 ] 00:15:01.767 }' 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.767 18:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:02.334 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.334 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:02.592 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:02.592 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.592 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1d779fd2-cecd-44ef-94ca-9894e63c3d26 00:15:02.850 [2024-07-24 18:51:47.757907] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:02.850 [2024-07-24 18:51:47.758015] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x183fde0 00:15:02.850 [2024-07-24 18:51:47.758023] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:02.850 [2024-07-24 18:51:47.758140] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18408e0 00:15:02.850 [2024-07-24 18:51:47.758216] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x183fde0 00:15:02.850 [2024-07-24 18:51:47.758222] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x183fde0 00:15:02.850 [2024-07-24 18:51:47.758279] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:02.850 NewBaseBdev 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:02.850 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:02.850 18:51:47 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.108 18:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:03.108 [ 00:15:03.108 { 00:15:03.108 "name": "NewBaseBdev", 00:15:03.108 "aliases": [ 00:15:03.108 "1d779fd2-cecd-44ef-94ca-9894e63c3d26" 00:15:03.108 ], 00:15:03.108 "product_name": "Malloc disk", 00:15:03.108 "block_size": 512, 00:15:03.108 "num_blocks": 65536, 00:15:03.108 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:03.108 "assigned_rate_limits": { 00:15:03.108 "rw_ios_per_sec": 0, 00:15:03.108 "rw_mbytes_per_sec": 0, 00:15:03.108 "r_mbytes_per_sec": 0, 00:15:03.108 "w_mbytes_per_sec": 0 00:15:03.108 }, 00:15:03.108 "claimed": true, 00:15:03.108 "claim_type": "exclusive_write", 00:15:03.108 "zoned": false, 00:15:03.108 "supported_io_types": { 00:15:03.108 "read": true, 00:15:03.108 "write": true, 00:15:03.108 "unmap": true, 00:15:03.108 "flush": true, 00:15:03.108 "reset": true, 00:15:03.108 "nvme_admin": false, 00:15:03.108 "nvme_io": false, 00:15:03.108 "nvme_io_md": false, 00:15:03.108 "write_zeroes": true, 00:15:03.108 "zcopy": true, 00:15:03.108 "get_zone_info": false, 00:15:03.108 "zone_management": false, 00:15:03.108 "zone_append": false, 00:15:03.108 "compare": false, 00:15:03.108 "compare_and_write": false, 00:15:03.108 "abort": true, 00:15:03.108 "seek_hole": false, 00:15:03.108 "seek_data": false, 00:15:03.108 "copy": true, 00:15:03.108 "nvme_iov_md": false 00:15:03.108 }, 00:15:03.108 "memory_domains": [ 00:15:03.108 { 00:15:03.108 "dma_device_id": "system", 00:15:03.108 "dma_device_type": 1 00:15:03.108 }, 00:15:03.108 { 00:15:03.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.108 "dma_device_type": 2 00:15:03.108 } 00:15:03.108 ], 00:15:03.108 "driver_specific": {} 00:15:03.108 } 00:15:03.108 ] 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:03.108 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.367 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.367 "name": "Existed_Raid", 00:15:03.367 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:15:03.367 "strip_size_kb": 64, 00:15:03.367 "state": "online", 00:15:03.367 "raid_level": "raid0", 00:15:03.367 "superblock": true, 00:15:03.367 "num_base_bdevs": 4, 00:15:03.367 "num_base_bdevs_discovered": 4, 00:15:03.367 "num_base_bdevs_operational": 4, 00:15:03.367 "base_bdevs_list": [ 00:15:03.367 { 00:15:03.367 "name": "NewBaseBdev", 00:15:03.367 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:03.367 "is_configured": true, 00:15:03.367 "data_offset": 2048, 00:15:03.367 "data_size": 63488 00:15:03.367 }, 00:15:03.367 { 00:15:03.367 "name": "BaseBdev2", 00:15:03.367 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:15:03.367 "is_configured": true, 00:15:03.367 "data_offset": 2048, 00:15:03.367 "data_size": 63488 00:15:03.367 }, 00:15:03.367 { 00:15:03.367 "name": "BaseBdev3", 00:15:03.367 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:15:03.367 "is_configured": true, 00:15:03.367 "data_offset": 2048, 00:15:03.367 "data_size": 63488 00:15:03.367 }, 00:15:03.367 { 00:15:03.367 "name": "BaseBdev4", 00:15:03.367 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:15:03.367 "is_configured": true, 00:15:03.367 "data_offset": 2048, 00:15:03.367 "data_size": 63488 00:15:03.367 } 00:15:03.367 ] 00:15:03.367 }' 00:15:03.367 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.367 18:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:03.934 [2024-07-24 18:51:48.856971] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:03.934 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:03.934 "name": "Existed_Raid", 00:15:03.934 "aliases": [ 00:15:03.934 "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1" 00:15:03.934 ], 00:15:03.934 "product_name": "Raid Volume", 00:15:03.934 "block_size": 512, 00:15:03.934 "num_blocks": 253952, 00:15:03.934 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:15:03.934 "assigned_rate_limits": { 00:15:03.934 "rw_ios_per_sec": 0, 00:15:03.934 "rw_mbytes_per_sec": 0, 00:15:03.934 "r_mbytes_per_sec": 0, 00:15:03.934 "w_mbytes_per_sec": 0 00:15:03.934 }, 00:15:03.934 
"claimed": false, 00:15:03.934 "zoned": false, 00:15:03.934 "supported_io_types": { 00:15:03.934 "read": true, 00:15:03.934 "write": true, 00:15:03.934 "unmap": true, 00:15:03.934 "flush": true, 00:15:03.934 "reset": true, 00:15:03.934 "nvme_admin": false, 00:15:03.934 "nvme_io": false, 00:15:03.934 "nvme_io_md": false, 00:15:03.934 "write_zeroes": true, 00:15:03.934 "zcopy": false, 00:15:03.934 "get_zone_info": false, 00:15:03.934 "zone_management": false, 00:15:03.934 "zone_append": false, 00:15:03.934 "compare": false, 00:15:03.934 "compare_and_write": false, 00:15:03.934 "abort": false, 00:15:03.934 "seek_hole": false, 00:15:03.934 "seek_data": false, 00:15:03.934 "copy": false, 00:15:03.934 "nvme_iov_md": false 00:15:03.934 }, 00:15:03.934 "memory_domains": [ 00:15:03.934 { 00:15:03.934 "dma_device_id": "system", 00:15:03.934 "dma_device_type": 1 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.934 "dma_device_type": 2 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "system", 00:15:03.934 "dma_device_type": 1 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.934 "dma_device_type": 2 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "system", 00:15:03.934 "dma_device_type": 1 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.934 "dma_device_type": 2 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "system", 00:15:03.934 "dma_device_type": 1 00:15:03.934 }, 00:15:03.934 { 00:15:03.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.934 "dma_device_type": 2 00:15:03.934 } 00:15:03.934 ], 00:15:03.934 "driver_specific": { 00:15:03.934 "raid": { 00:15:03.934 "uuid": "5fc13a6b-14d7-4384-a605-ab90c8ae3fc1", 00:15:03.934 "strip_size_kb": 64, 00:15:03.934 "state": "online", 00:15:03.934 "raid_level": "raid0", 00:15:03.935 "superblock": true, 00:15:03.935 "num_base_bdevs": 4, 00:15:03.935 "num_base_bdevs_discovered": 4, 00:15:03.935 "num_base_bdevs_operational": 4, 00:15:03.935 "base_bdevs_list": [ 00:15:03.935 { 00:15:03.935 "name": "NewBaseBdev", 00:15:03.935 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:03.935 "is_configured": true, 00:15:03.935 "data_offset": 2048, 00:15:03.935 "data_size": 63488 00:15:03.935 }, 00:15:03.935 { 00:15:03.935 "name": "BaseBdev2", 00:15:03.935 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:15:03.935 "is_configured": true, 00:15:03.935 "data_offset": 2048, 00:15:03.935 "data_size": 63488 00:15:03.935 }, 00:15:03.935 { 00:15:03.935 "name": "BaseBdev3", 00:15:03.935 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:15:03.935 "is_configured": true, 00:15:03.935 "data_offset": 2048, 00:15:03.935 "data_size": 63488 00:15:03.935 }, 00:15:03.935 { 00:15:03.935 "name": "BaseBdev4", 00:15:03.935 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:15:03.935 "is_configured": true, 00:15:03.935 "data_offset": 2048, 00:15:03.935 "data_size": 63488 00:15:03.935 } 00:15:03.935 ] 00:15:03.935 } 00:15:03.935 } 00:15:03.935 }' 00:15:03.935 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:03.935 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:03.935 BaseBdev2 00:15:03.935 BaseBdev3 00:15:03.935 BaseBdev4' 00:15:03.935 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:15:03.935 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:03.935 18:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.193 "name": "NewBaseBdev", 00:15:04.193 "aliases": [ 00:15:04.193 "1d779fd2-cecd-44ef-94ca-9894e63c3d26" 00:15:04.193 ], 00:15:04.193 "product_name": "Malloc disk", 00:15:04.193 "block_size": 512, 00:15:04.193 "num_blocks": 65536, 00:15:04.193 "uuid": "1d779fd2-cecd-44ef-94ca-9894e63c3d26", 00:15:04.193 "assigned_rate_limits": { 00:15:04.193 "rw_ios_per_sec": 0, 00:15:04.193 "rw_mbytes_per_sec": 0, 00:15:04.193 "r_mbytes_per_sec": 0, 00:15:04.193 "w_mbytes_per_sec": 0 00:15:04.193 }, 00:15:04.193 "claimed": true, 00:15:04.193 "claim_type": "exclusive_write", 00:15:04.193 "zoned": false, 00:15:04.193 "supported_io_types": { 00:15:04.193 "read": true, 00:15:04.193 "write": true, 00:15:04.193 "unmap": true, 00:15:04.193 "flush": true, 00:15:04.193 "reset": true, 00:15:04.193 "nvme_admin": false, 00:15:04.193 "nvme_io": false, 00:15:04.193 "nvme_io_md": false, 00:15:04.193 "write_zeroes": true, 00:15:04.193 "zcopy": true, 00:15:04.193 "get_zone_info": false, 00:15:04.193 "zone_management": false, 00:15:04.193 "zone_append": false, 00:15:04.193 "compare": false, 00:15:04.193 "compare_and_write": false, 00:15:04.193 "abort": true, 00:15:04.193 "seek_hole": false, 00:15:04.193 "seek_data": false, 00:15:04.193 "copy": true, 00:15:04.193 "nvme_iov_md": false 00:15:04.193 }, 00:15:04.193 "memory_domains": [ 00:15:04.193 { 00:15:04.193 "dma_device_id": "system", 00:15:04.193 "dma_device_type": 1 00:15:04.193 }, 00:15:04.193 { 00:15:04.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.193 "dma_device_type": 2 00:15:04.193 } 00:15:04.193 ], 00:15:04.193 "driver_specific": {} 00:15:04.193 }' 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.193 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:04.451 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.709 "name": "BaseBdev2", 00:15:04.709 "aliases": [ 00:15:04.709 "ccc78205-c13f-4738-bbc8-63ccd9fbf50d" 00:15:04.709 ], 00:15:04.709 "product_name": "Malloc disk", 00:15:04.709 "block_size": 512, 00:15:04.709 "num_blocks": 65536, 00:15:04.709 "uuid": "ccc78205-c13f-4738-bbc8-63ccd9fbf50d", 00:15:04.709 "assigned_rate_limits": { 00:15:04.709 "rw_ios_per_sec": 0, 00:15:04.709 "rw_mbytes_per_sec": 0, 00:15:04.709 "r_mbytes_per_sec": 0, 00:15:04.709 "w_mbytes_per_sec": 0 00:15:04.709 }, 00:15:04.709 "claimed": true, 00:15:04.709 "claim_type": "exclusive_write", 00:15:04.709 "zoned": false, 00:15:04.709 "supported_io_types": { 00:15:04.709 "read": true, 00:15:04.709 "write": true, 00:15:04.709 "unmap": true, 00:15:04.709 "flush": true, 00:15:04.709 "reset": true, 00:15:04.709 "nvme_admin": false, 00:15:04.709 "nvme_io": false, 00:15:04.709 "nvme_io_md": false, 00:15:04.709 "write_zeroes": true, 00:15:04.709 "zcopy": true, 00:15:04.709 "get_zone_info": false, 00:15:04.709 "zone_management": false, 00:15:04.709 "zone_append": false, 00:15:04.709 "compare": false, 00:15:04.709 "compare_and_write": false, 00:15:04.709 "abort": true, 00:15:04.709 "seek_hole": false, 00:15:04.709 "seek_data": false, 00:15:04.709 "copy": true, 00:15:04.709 "nvme_iov_md": false 00:15:04.709 }, 00:15:04.709 "memory_domains": [ 00:15:04.709 { 00:15:04.709 "dma_device_id": "system", 00:15:04.709 "dma_device_type": 1 00:15:04.709 }, 00:15:04.709 { 00:15:04.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.709 "dma_device_type": 2 00:15:04.709 } 00:15:04.709 ], 00:15:04.709 "driver_specific": {} 00:15:04.709 }' 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.709 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:04.968 
18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.968 "name": "BaseBdev3", 00:15:04.968 "aliases": [ 00:15:04.968 "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03" 00:15:04.968 ], 00:15:04.968 "product_name": "Malloc disk", 00:15:04.968 "block_size": 512, 00:15:04.968 "num_blocks": 65536, 00:15:04.968 "uuid": "9b86fa54-3c6c-4bf5-a93a-cab6a72d0d03", 00:15:04.968 "assigned_rate_limits": { 00:15:04.968 "rw_ios_per_sec": 0, 00:15:04.968 "rw_mbytes_per_sec": 0, 00:15:04.968 "r_mbytes_per_sec": 0, 00:15:04.968 "w_mbytes_per_sec": 0 00:15:04.968 }, 00:15:04.968 "claimed": true, 00:15:04.968 "claim_type": "exclusive_write", 00:15:04.968 "zoned": false, 00:15:04.968 "supported_io_types": { 00:15:04.968 "read": true, 00:15:04.968 "write": true, 00:15:04.968 "unmap": true, 00:15:04.968 "flush": true, 00:15:04.968 "reset": true, 00:15:04.968 "nvme_admin": false, 00:15:04.968 "nvme_io": false, 00:15:04.968 "nvme_io_md": false, 00:15:04.968 "write_zeroes": true, 00:15:04.968 "zcopy": true, 00:15:04.968 "get_zone_info": false, 00:15:04.968 "zone_management": false, 00:15:04.968 "zone_append": false, 00:15:04.968 "compare": false, 00:15:04.968 "compare_and_write": false, 00:15:04.968 "abort": true, 00:15:04.968 "seek_hole": false, 00:15:04.968 "seek_data": false, 00:15:04.968 "copy": true, 00:15:04.968 "nvme_iov_md": false 00:15:04.968 }, 00:15:04.968 "memory_domains": [ 00:15:04.968 { 00:15:04.968 "dma_device_id": "system", 00:15:04.968 "dma_device_type": 1 00:15:04.968 }, 00:15:04.968 { 00:15:04.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.968 "dma_device_type": 2 00:15:04.968 } 00:15:04.968 ], 00:15:04.968 "driver_specific": {} 00:15:04.968 }' 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.968 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.226 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.226 18:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.226 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.485 18:51:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.485 "name": "BaseBdev4", 00:15:05.485 "aliases": [ 00:15:05.485 "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b" 00:15:05.485 ], 00:15:05.485 "product_name": "Malloc disk", 00:15:05.485 "block_size": 512, 00:15:05.485 "num_blocks": 65536, 00:15:05.485 "uuid": "1cc23c24-65ba-4bfe-96df-c4ace54c6f3b", 00:15:05.485 "assigned_rate_limits": { 00:15:05.485 "rw_ios_per_sec": 0, 00:15:05.485 "rw_mbytes_per_sec": 0, 00:15:05.485 "r_mbytes_per_sec": 0, 00:15:05.485 "w_mbytes_per_sec": 0 00:15:05.485 }, 00:15:05.485 "claimed": true, 00:15:05.485 "claim_type": "exclusive_write", 00:15:05.485 "zoned": false, 00:15:05.485 "supported_io_types": { 00:15:05.485 "read": true, 00:15:05.485 "write": true, 00:15:05.485 "unmap": true, 00:15:05.485 "flush": true, 00:15:05.485 "reset": true, 00:15:05.485 "nvme_admin": false, 00:15:05.485 "nvme_io": false, 00:15:05.485 "nvme_io_md": false, 00:15:05.485 "write_zeroes": true, 00:15:05.485 "zcopy": true, 00:15:05.485 "get_zone_info": false, 00:15:05.485 "zone_management": false, 00:15:05.485 "zone_append": false, 00:15:05.485 "compare": false, 00:15:05.485 "compare_and_write": false, 00:15:05.485 "abort": true, 00:15:05.485 "seek_hole": false, 00:15:05.485 "seek_data": false, 00:15:05.485 "copy": true, 00:15:05.485 "nvme_iov_md": false 00:15:05.485 }, 00:15:05.485 "memory_domains": [ 00:15:05.485 { 00:15:05.485 "dma_device_id": "system", 00:15:05.485 "dma_device_type": 1 00:15:05.485 }, 00:15:05.485 { 00:15:05.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.485 "dma_device_type": 2 00:15:05.485 } 00:15:05.485 ], 00:15:05.485 "driver_specific": {} 00:15:05.485 }' 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.485 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.743 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:06.001 [2024-07-24 18:51:50.841894] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:06.001 [2024-07-24 18:51:50.841914] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.001 [2024-07-24 18:51:50.841951] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
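The block above is the tail of verify_raid_bdev_properties: for every base bdev that is configured in Existed_Raid, the test compares block_size, md_size, md_interleave and dif_type against the raid volume's own descriptor, and then tears the volume down with bdev_raid_delete (whose DEBUG output continues below). A compact sketch of that comparison loop, built only from the RPCs and jq filters visible in the trace (the loop structure itself is illustrative):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
raid_info=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq '.[]')
# Every configured member must report the same geometry and metadata settings as the volume.
for name in $(echo "$raid_info" | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'); do
  base_info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
  for field in .block_size .md_size .md_interleave .dif_type; do
    [ "$(echo "$raid_info" | jq "$field")" = "$(echo "$base_info" | jq "$field")" ] || echo "$name: $field mismatch"
  done
done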
00:15:06.001 [2024-07-24 18:51:50.841997] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.001 [2024-07-24 18:51:50.842003] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x183fde0 name Existed_Raid, state offline 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2110943 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2110943 ']' 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2110943 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2110943 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2110943' 00:15:06.001 killing process with pid 2110943 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2110943 00:15:06.001 [2024-07-24 18:51:50.907286] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:06.001 18:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2110943 00:15:06.001 [2024-07-24 18:51:50.939012] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:06.260 18:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:06.260 00:15:06.260 real 0m23.801s 00:15:06.260 user 0m44.336s 00:15:06.260 sys 0m3.613s 00:15:06.260 18:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:06.260 18:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:06.260 ************************************ 00:15:06.260 END TEST raid_state_function_test_sb 00:15:06.260 ************************************ 00:15:06.260 18:51:51 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:15:06.260 18:51:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:06.260 18:51:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:06.260 18:51:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:06.260 ************************************ 00:15:06.260 START TEST raid_superblock_test 00:15:06.260 ************************************ 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2115517 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2115517 /var/tmp/spdk-raid.sock 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2115517 ']' 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:06.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:06.260 18:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.261 [2024-07-24 18:51:51.233622] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
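raid_superblock_test starts its own bdev_svc application with the bdev_raid log flag on a private UNIX-domain RPC socket, records the pid (here 2115517) so it can be killed at the end, and waits for the socket via waitforlisten before issuing any RPCs. A rough sketch of that startup with the binary path and flags taken from the trace; the readiness poll stands in for the waitforlisten helper and is purely illustrative:

svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
sock=/var/tmp/spdk-raid.sock
# -r selects the RPC listen address, -L bdev_raid enables the DEBUG traces seen throughout this log.
$svc -r $sock -L bdev_raid &
raid_pid=$!
# Wait for the RPC socket to appear before the first rpc.py call.
until [ -S "$sock" ]; do sleep 0.1; done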
00:15:06.261 [2024-07-24 18:51:51.233663] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2115517 ] 00:15:06.519 [2024-07-24 18:51:51.299576] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:06.519 [2024-07-24 18:51:51.376271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:06.520 [2024-07-24 18:51:51.435080] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.520 [2024-07-24 18:51:51.435104] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:07.087 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:07.088 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:07.088 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:07.088 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:07.088 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:07.346 malloc1 00:15:07.346 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:07.346 [2024-07-24 18:51:52.338766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:07.346 [2024-07-24 18:51:52.338803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.346 [2024-07-24 18:51:52.338814] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e95e20 00:15:07.346 [2024-07-24 18:51:52.338820] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.346 [2024-07-24 18:51:52.339877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.346 [2024-07-24 18:51:52.339898] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:07.346 pt1 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:07.605 malloc2 00:15:07.605 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:07.863 [2024-07-24 18:51:52.691225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:07.863 [2024-07-24 18:51:52.691261] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.863 [2024-07-24 18:51:52.691270] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x203fed0 00:15:07.863 [2024-07-24 18:51:52.691275] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.863 [2024-07-24 18:51:52.692248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.863 [2024-07-24 18:51:52.692269] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:07.863 pt2 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:07.863 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:08.122 malloc3 00:15:08.122 18:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:08.122 [2024-07-24 18:51:53.051675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:08.122 [2024-07-24 18:51:53.051708] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.122 [2024-07-24 18:51:53.051717] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2043a30 00:15:08.122 [2024-07-24 18:51:53.051724] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.122 [2024-07-24 18:51:53.052715] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.122 [2024-07-24 18:51:53.052738] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:08.122 pt3 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:08.122 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:15:08.485 malloc4 00:15:08.485 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:08.743 [2024-07-24 18:51:53.399942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:08.743 [2024-07-24 18:51:53.399975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.743 [2024-07-24 18:51:53.399984] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2040900 00:15:08.743 [2024-07-24 18:51:53.399991] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.743 [2024-07-24 18:51:53.400943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.743 [2024-07-24 18:51:53.400964] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:08.743 pt4 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:15:08.743 [2024-07-24 18:51:53.564364] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:08.743 [2024-07-24 18:51:53.565149] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:08.743 [2024-07-24 18:51:53.565186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:08.743 [2024-07-24 18:51:53.565213] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:08.743 [2024-07-24 18:51:53.565322] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2043d40 00:15:08.743 [2024-07-24 18:51:53.565328] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:08.743 [2024-07-24 18:51:53.565453] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2048140 00:15:08.743 [2024-07-24 18:51:53.565554] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2043d40 00:15:08.743 [2024-07-24 18:51:53.565560] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2043d40 00:15:08.743 [2024-07-24 18:51:53.565619] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.743 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:09.001 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.001 "name": "raid_bdev1", 00:15:09.001 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:09.001 "strip_size_kb": 64, 00:15:09.001 "state": "online", 00:15:09.001 "raid_level": "raid0", 00:15:09.001 "superblock": true, 00:15:09.001 "num_base_bdevs": 4, 00:15:09.001 "num_base_bdevs_discovered": 4, 00:15:09.001 "num_base_bdevs_operational": 4, 00:15:09.001 "base_bdevs_list": [ 00:15:09.001 { 00:15:09.001 "name": "pt1", 00:15:09.001 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:09.001 "is_configured": true, 00:15:09.001 "data_offset": 2048, 00:15:09.001 "data_size": 63488 00:15:09.001 }, 00:15:09.001 { 00:15:09.001 "name": "pt2", 00:15:09.001 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:09.001 "is_configured": true, 00:15:09.001 "data_offset": 2048, 00:15:09.001 "data_size": 63488 00:15:09.001 }, 00:15:09.001 { 00:15:09.001 "name": "pt3", 00:15:09.001 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:09.001 "is_configured": true, 00:15:09.001 "data_offset": 2048, 00:15:09.001 "data_size": 63488 00:15:09.001 }, 00:15:09.001 { 00:15:09.001 "name": "pt4", 00:15:09.001 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:09.001 "is_configured": true, 00:15:09.001 "data_offset": 2048, 00:15:09.001 "data_size": 63488 00:15:09.001 } 00:15:09.001 ] 00:15:09.001 }' 00:15:09.001 18:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.001 18:51:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:09.259 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:09.518 [2024-07-24 18:51:54.386711] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:09.518 "name": "raid_bdev1", 00:15:09.518 "aliases": [ 00:15:09.518 "02d2d5aa-efbc-4332-adc1-6495b467b5d7" 00:15:09.518 ], 00:15:09.518 "product_name": "Raid Volume", 00:15:09.518 "block_size": 512, 00:15:09.518 "num_blocks": 253952, 00:15:09.518 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:09.518 "assigned_rate_limits": { 00:15:09.518 "rw_ios_per_sec": 0, 00:15:09.518 "rw_mbytes_per_sec": 0, 00:15:09.518 "r_mbytes_per_sec": 0, 00:15:09.518 "w_mbytes_per_sec": 0 00:15:09.518 }, 00:15:09.518 "claimed": false, 00:15:09.518 "zoned": false, 00:15:09.518 "supported_io_types": { 00:15:09.518 "read": true, 00:15:09.518 "write": true, 00:15:09.518 "unmap": true, 00:15:09.518 "flush": true, 00:15:09.518 "reset": true, 00:15:09.518 "nvme_admin": false, 00:15:09.518 "nvme_io": false, 00:15:09.518 "nvme_io_md": false, 00:15:09.518 "write_zeroes": true, 00:15:09.518 "zcopy": false, 00:15:09.518 "get_zone_info": false, 00:15:09.518 "zone_management": false, 00:15:09.518 "zone_append": false, 00:15:09.518 "compare": false, 00:15:09.518 "compare_and_write": false, 00:15:09.518 "abort": false, 00:15:09.518 "seek_hole": false, 00:15:09.518 "seek_data": false, 00:15:09.518 "copy": false, 00:15:09.518 "nvme_iov_md": false 00:15:09.518 }, 00:15:09.518 "memory_domains": [ 00:15:09.518 { 00:15:09.518 "dma_device_id": "system", 00:15:09.518 "dma_device_type": 1 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.518 "dma_device_type": 2 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "system", 00:15:09.518 "dma_device_type": 1 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.518 "dma_device_type": 2 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "system", 00:15:09.518 "dma_device_type": 1 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.518 "dma_device_type": 2 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "system", 00:15:09.518 "dma_device_type": 1 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.518 "dma_device_type": 2 00:15:09.518 } 00:15:09.518 ], 00:15:09.518 "driver_specific": { 00:15:09.518 "raid": { 00:15:09.518 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:09.518 "strip_size_kb": 64, 00:15:09.518 "state": "online", 00:15:09.518 "raid_level": "raid0", 00:15:09.518 "superblock": 
true, 00:15:09.518 "num_base_bdevs": 4, 00:15:09.518 "num_base_bdevs_discovered": 4, 00:15:09.518 "num_base_bdevs_operational": 4, 00:15:09.518 "base_bdevs_list": [ 00:15:09.518 { 00:15:09.518 "name": "pt1", 00:15:09.518 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:09.518 "is_configured": true, 00:15:09.518 "data_offset": 2048, 00:15:09.518 "data_size": 63488 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "name": "pt2", 00:15:09.518 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:09.518 "is_configured": true, 00:15:09.518 "data_offset": 2048, 00:15:09.518 "data_size": 63488 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "name": "pt3", 00:15:09.518 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:09.518 "is_configured": true, 00:15:09.518 "data_offset": 2048, 00:15:09.518 "data_size": 63488 00:15:09.518 }, 00:15:09.518 { 00:15:09.518 "name": "pt4", 00:15:09.518 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:09.518 "is_configured": true, 00:15:09.518 "data_offset": 2048, 00:15:09.518 "data_size": 63488 00:15:09.518 } 00:15:09.518 ] 00:15:09.518 } 00:15:09.518 } 00:15:09.518 }' 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:09.518 pt2 00:15:09.518 pt3 00:15:09.518 pt4' 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:09.518 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.777 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.777 "name": "pt1", 00:15:09.778 "aliases": [ 00:15:09.778 "00000000-0000-0000-0000-000000000001" 00:15:09.778 ], 00:15:09.778 "product_name": "passthru", 00:15:09.778 "block_size": 512, 00:15:09.778 "num_blocks": 65536, 00:15:09.778 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:09.778 "assigned_rate_limits": { 00:15:09.778 "rw_ios_per_sec": 0, 00:15:09.778 "rw_mbytes_per_sec": 0, 00:15:09.778 "r_mbytes_per_sec": 0, 00:15:09.778 "w_mbytes_per_sec": 0 00:15:09.778 }, 00:15:09.778 "claimed": true, 00:15:09.778 "claim_type": "exclusive_write", 00:15:09.778 "zoned": false, 00:15:09.778 "supported_io_types": { 00:15:09.778 "read": true, 00:15:09.778 "write": true, 00:15:09.778 "unmap": true, 00:15:09.778 "flush": true, 00:15:09.778 "reset": true, 00:15:09.778 "nvme_admin": false, 00:15:09.778 "nvme_io": false, 00:15:09.778 "nvme_io_md": false, 00:15:09.778 "write_zeroes": true, 00:15:09.778 "zcopy": true, 00:15:09.778 "get_zone_info": false, 00:15:09.778 "zone_management": false, 00:15:09.778 "zone_append": false, 00:15:09.778 "compare": false, 00:15:09.778 "compare_and_write": false, 00:15:09.778 "abort": true, 00:15:09.778 "seek_hole": false, 00:15:09.778 "seek_data": false, 00:15:09.778 "copy": true, 00:15:09.778 "nvme_iov_md": false 00:15:09.778 }, 00:15:09.778 "memory_domains": [ 00:15:09.778 { 00:15:09.778 "dma_device_id": "system", 00:15:09.778 "dma_device_type": 1 00:15:09.778 }, 00:15:09.778 { 00:15:09.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.778 "dma_device_type": 2 00:15:09.778 } 00:15:09.778 ], 00:15:09.778 "driver_specific": { 00:15:09.778 "passthru": 
{ 00:15:09.778 "name": "pt1", 00:15:09.778 "base_bdev_name": "malloc1" 00:15:09.778 } 00:15:09.778 } 00:15:09.778 }' 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.778 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.036 18:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.295 "name": "pt2", 00:15:10.295 "aliases": [ 00:15:10.295 "00000000-0000-0000-0000-000000000002" 00:15:10.295 ], 00:15:10.295 "product_name": "passthru", 00:15:10.295 "block_size": 512, 00:15:10.295 "num_blocks": 65536, 00:15:10.295 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:10.295 "assigned_rate_limits": { 00:15:10.295 "rw_ios_per_sec": 0, 00:15:10.295 "rw_mbytes_per_sec": 0, 00:15:10.295 "r_mbytes_per_sec": 0, 00:15:10.295 "w_mbytes_per_sec": 0 00:15:10.295 }, 00:15:10.295 "claimed": true, 00:15:10.295 "claim_type": "exclusive_write", 00:15:10.295 "zoned": false, 00:15:10.295 "supported_io_types": { 00:15:10.295 "read": true, 00:15:10.295 "write": true, 00:15:10.295 "unmap": true, 00:15:10.295 "flush": true, 00:15:10.295 "reset": true, 00:15:10.295 "nvme_admin": false, 00:15:10.295 "nvme_io": false, 00:15:10.295 "nvme_io_md": false, 00:15:10.295 "write_zeroes": true, 00:15:10.295 "zcopy": true, 00:15:10.295 "get_zone_info": false, 00:15:10.295 "zone_management": false, 00:15:10.295 "zone_append": false, 00:15:10.295 "compare": false, 00:15:10.295 "compare_and_write": false, 00:15:10.295 "abort": true, 00:15:10.295 "seek_hole": false, 00:15:10.295 "seek_data": false, 00:15:10.295 "copy": true, 00:15:10.295 "nvme_iov_md": false 00:15:10.295 }, 00:15:10.295 "memory_domains": [ 00:15:10.295 { 00:15:10.295 "dma_device_id": "system", 00:15:10.295 "dma_device_type": 1 00:15:10.295 }, 00:15:10.295 { 00:15:10.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.295 "dma_device_type": 2 00:15:10.295 } 00:15:10.295 ], 00:15:10.295 "driver_specific": { 00:15:10.295 "passthru": { 00:15:10.295 "name": "pt2", 00:15:10.295 "base_bdev_name": "malloc2" 00:15:10.295 } 00:15:10.295 } 00:15:10.295 }' 00:15:10.295 
18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.295 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:10.554 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.554 "name": "pt3", 00:15:10.554 "aliases": [ 00:15:10.554 "00000000-0000-0000-0000-000000000003" 00:15:10.554 ], 00:15:10.554 "product_name": "passthru", 00:15:10.554 "block_size": 512, 00:15:10.554 "num_blocks": 65536, 00:15:10.554 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:10.554 "assigned_rate_limits": { 00:15:10.554 "rw_ios_per_sec": 0, 00:15:10.554 "rw_mbytes_per_sec": 0, 00:15:10.554 "r_mbytes_per_sec": 0, 00:15:10.554 "w_mbytes_per_sec": 0 00:15:10.554 }, 00:15:10.554 "claimed": true, 00:15:10.554 "claim_type": "exclusive_write", 00:15:10.554 "zoned": false, 00:15:10.554 "supported_io_types": { 00:15:10.555 "read": true, 00:15:10.555 "write": true, 00:15:10.555 "unmap": true, 00:15:10.555 "flush": true, 00:15:10.555 "reset": true, 00:15:10.555 "nvme_admin": false, 00:15:10.555 "nvme_io": false, 00:15:10.555 "nvme_io_md": false, 00:15:10.555 "write_zeroes": true, 00:15:10.555 "zcopy": true, 00:15:10.555 "get_zone_info": false, 00:15:10.555 "zone_management": false, 00:15:10.555 "zone_append": false, 00:15:10.555 "compare": false, 00:15:10.555 "compare_and_write": false, 00:15:10.555 "abort": true, 00:15:10.555 "seek_hole": false, 00:15:10.555 "seek_data": false, 00:15:10.555 "copy": true, 00:15:10.555 "nvme_iov_md": false 00:15:10.555 }, 00:15:10.555 "memory_domains": [ 00:15:10.555 { 00:15:10.555 "dma_device_id": "system", 00:15:10.555 "dma_device_type": 1 00:15:10.555 }, 00:15:10.555 { 00:15:10.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.555 "dma_device_type": 2 00:15:10.555 } 00:15:10.555 ], 00:15:10.555 "driver_specific": { 00:15:10.555 "passthru": { 00:15:10.555 "name": "pt3", 00:15:10.555 "base_bdev_name": "malloc3" 00:15:10.555 } 00:15:10.555 } 00:15:10.555 }' 00:15:10.555 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.555 18:51:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.814 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:11.072 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.072 "name": "pt4", 00:15:11.072 "aliases": [ 00:15:11.072 "00000000-0000-0000-0000-000000000004" 00:15:11.072 ], 00:15:11.072 "product_name": "passthru", 00:15:11.072 "block_size": 512, 00:15:11.072 "num_blocks": 65536, 00:15:11.072 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:11.072 "assigned_rate_limits": { 00:15:11.072 "rw_ios_per_sec": 0, 00:15:11.072 "rw_mbytes_per_sec": 0, 00:15:11.072 "r_mbytes_per_sec": 0, 00:15:11.072 "w_mbytes_per_sec": 0 00:15:11.072 }, 00:15:11.072 "claimed": true, 00:15:11.072 "claim_type": "exclusive_write", 00:15:11.072 "zoned": false, 00:15:11.072 "supported_io_types": { 00:15:11.072 "read": true, 00:15:11.072 "write": true, 00:15:11.072 "unmap": true, 00:15:11.072 "flush": true, 00:15:11.072 "reset": true, 00:15:11.072 "nvme_admin": false, 00:15:11.072 "nvme_io": false, 00:15:11.072 "nvme_io_md": false, 00:15:11.072 "write_zeroes": true, 00:15:11.072 "zcopy": true, 00:15:11.072 "get_zone_info": false, 00:15:11.072 "zone_management": false, 00:15:11.072 "zone_append": false, 00:15:11.072 "compare": false, 00:15:11.072 "compare_and_write": false, 00:15:11.072 "abort": true, 00:15:11.072 "seek_hole": false, 00:15:11.072 "seek_data": false, 00:15:11.072 "copy": true, 00:15:11.072 "nvme_iov_md": false 00:15:11.072 }, 00:15:11.072 "memory_domains": [ 00:15:11.072 { 00:15:11.072 "dma_device_id": "system", 00:15:11.072 "dma_device_type": 1 00:15:11.072 }, 00:15:11.072 { 00:15:11.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.072 "dma_device_type": 2 00:15:11.072 } 00:15:11.072 ], 00:15:11.072 "driver_specific": { 00:15:11.072 "passthru": { 00:15:11.072 "name": "pt4", 00:15:11.072 "base_bdev_name": "malloc4" 00:15:11.072 } 00:15:11.072 } 00:15:11.072 }' 00:15:11.072 18:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.072 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.072 18:51:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.072 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:11.332 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:11.591 [2024-07-24 18:51:56.436020] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.591 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=02d2d5aa-efbc-4332-adc1-6495b467b5d7 00:15:11.591 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 02d2d5aa-efbc-4332-adc1-6495b467b5d7 ']' 00:15:11.591 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:11.591 [2024-07-24 18:51:56.600235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:11.591 [2024-07-24 18:51:56.600249] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:11.850 [2024-07-24 18:51:56.600287] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:11.850 [2024-07-24 18:51:56.600330] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:11.850 [2024-07-24 18:51:56.600336] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2043d40 name raid_bdev1, state offline 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:11.850 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:12.109 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:12.109 18:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:12.109 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:12.109 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:12.367 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:12.367 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:12.625 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:12.884 [2024-07-24 18:51:57.739152] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:12.884 [2024-07-24 18:51:57.740138] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:12.884 [2024-07-24 18:51:57.740171] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
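By this point the log has covered one full cycle of the test: four 32 MiB malloc bdevs are wrapped in passthru bdevs with fixed UUIDs, assembled into a raid0 volume with a 64 KiB strip size and a raid superblock written to the base bdevs, verified as online with all four base bdevs discovered, and then torn down again. A condensed sketch of that cycle, restating the rpc.py calls visible above (RPC is shorthand for the same rpc.py/socket invocation the log uses, and the loops simply fold together the four unrolled calls):

RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Four 32 MiB malloc bdevs (512-byte blocks), each wrapped in a passthru bdev.
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b "malloc$i"
    $RPC bdev_passthru_create -b "malloc$i" -p "pt$i" \
        -u "00000000-0000-0000-0000-00000000000$i"
done

# raid0 across the four passthru bdevs, 64 KiB strips, with a superblock (-s).
$RPC bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

# The volume should report state "online" with 4 of 4 base bdevs discovered.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'

# Tear-down: delete the raid volume first, then the passthru base bdevs.
$RPC bdev_raid_delete raid_bdev1
for i in 1 2 3 4; do $RPC bdev_passthru_delete "pt$i"; done

Because the volume was created with -s, the superblock remains on malloc1..malloc4 after the passthru bdevs are removed, which is why the bdev_raid_create attempt issued directly against the malloc bdevs below is expected to fail with "File exists".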
00:15:12.884 [2024-07-24 18:51:57.740193] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:15:12.884 [2024-07-24 18:51:57.740224] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:12.884 [2024-07-24 18:51:57.740252] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:12.884 [2024-07-24 18:51:57.740264] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:12.884 [2024-07-24 18:51:57.740277] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:15:12.884 [2024-07-24 18:51:57.740286] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:12.884 [2024-07-24 18:51:57.740291] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e95490 name raid_bdev1, state configuring 00:15:12.884 request: 00:15:12.884 { 00:15:12.884 "name": "raid_bdev1", 00:15:12.884 "raid_level": "raid0", 00:15:12.884 "base_bdevs": [ 00:15:12.884 "malloc1", 00:15:12.884 "malloc2", 00:15:12.884 "malloc3", 00:15:12.884 "malloc4" 00:15:12.884 ], 00:15:12.884 "strip_size_kb": 64, 00:15:12.884 "superblock": false, 00:15:12.884 "method": "bdev_raid_create", 00:15:12.884 "req_id": 1 00:15:12.884 } 00:15:12.884 Got JSON-RPC error response 00:15:12.884 response: 00:15:12.884 { 00:15:12.884 "code": -17, 00:15:12.884 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:12.884 } 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.884 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:13.143 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:13.143 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:13.143 18:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:13.143 [2024-07-24 18:51:58.071973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:13.143 [2024-07-24 18:51:58.072002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:13.143 [2024-07-24 18:51:58.072020] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2040100 00:15:13.143 [2024-07-24 18:51:58.072026] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:13.143 [2024-07-24 18:51:58.073210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:13.143 [2024-07-24 18:51:58.073232] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:13.143 [2024-07-24 
18:51:58.073280] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:13.143 [2024-07-24 18:51:58.073298] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:13.143 pt1 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:13.143 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.401 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.401 "name": "raid_bdev1", 00:15:13.401 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:13.401 "strip_size_kb": 64, 00:15:13.401 "state": "configuring", 00:15:13.401 "raid_level": "raid0", 00:15:13.401 "superblock": true, 00:15:13.401 "num_base_bdevs": 4, 00:15:13.401 "num_base_bdevs_discovered": 1, 00:15:13.401 "num_base_bdevs_operational": 4, 00:15:13.401 "base_bdevs_list": [ 00:15:13.401 { 00:15:13.401 "name": "pt1", 00:15:13.401 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:13.401 "is_configured": true, 00:15:13.401 "data_offset": 2048, 00:15:13.401 "data_size": 63488 00:15:13.401 }, 00:15:13.401 { 00:15:13.401 "name": null, 00:15:13.401 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:13.401 "is_configured": false, 00:15:13.401 "data_offset": 2048, 00:15:13.401 "data_size": 63488 00:15:13.401 }, 00:15:13.401 { 00:15:13.401 "name": null, 00:15:13.401 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:13.401 "is_configured": false, 00:15:13.401 "data_offset": 2048, 00:15:13.401 "data_size": 63488 00:15:13.401 }, 00:15:13.401 { 00:15:13.401 "name": null, 00:15:13.401 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:13.401 "is_configured": false, 00:15:13.401 "data_offset": 2048, 00:15:13.401 "data_size": 63488 00:15:13.401 } 00:15:13.401 ] 00:15:13.401 }' 00:15:13.401 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.401 18:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.966 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:15:13.966 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:13.966 [2024-07-24 18:51:58.910156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:13.966 [2024-07-24 18:51:58.910198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:13.966 [2024-07-24 18:51:58.910209] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2045070 00:15:13.966 [2024-07-24 18:51:58.910215] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:13.966 [2024-07-24 18:51:58.910490] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:13.966 [2024-07-24 18:51:58.910504] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:13.966 [2024-07-24 18:51:58.910555] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:13.966 [2024-07-24 18:51:58.910568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:13.966 pt2 00:15:13.966 18:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:14.225 [2024-07-24 18:51:59.078596] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.225 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:14.483 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.483 "name": "raid_bdev1", 00:15:14.483 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:14.484 "strip_size_kb": 64, 00:15:14.484 "state": "configuring", 00:15:14.484 "raid_level": "raid0", 00:15:14.484 "superblock": true, 00:15:14.484 "num_base_bdevs": 4, 00:15:14.484 "num_base_bdevs_discovered": 1, 00:15:14.484 "num_base_bdevs_operational": 4, 00:15:14.484 "base_bdevs_list": [ 00:15:14.484 { 00:15:14.484 "name": "pt1", 00:15:14.484 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:14.484 "is_configured": true, 00:15:14.484 "data_offset": 2048, 00:15:14.484 "data_size": 63488 00:15:14.484 }, 00:15:14.484 { 
00:15:14.484 "name": null, 00:15:14.484 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:14.484 "is_configured": false, 00:15:14.484 "data_offset": 2048, 00:15:14.484 "data_size": 63488 00:15:14.484 }, 00:15:14.484 { 00:15:14.484 "name": null, 00:15:14.484 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:14.484 "is_configured": false, 00:15:14.484 "data_offset": 2048, 00:15:14.484 "data_size": 63488 00:15:14.484 }, 00:15:14.484 { 00:15:14.484 "name": null, 00:15:14.484 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:14.484 "is_configured": false, 00:15:14.484 "data_offset": 2048, 00:15:14.484 "data_size": 63488 00:15:14.484 } 00:15:14.484 ] 00:15:14.484 }' 00:15:14.484 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.484 18:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:14.741 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:14.741 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:14.741 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:15.000 [2024-07-24 18:51:59.876664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:15.000 [2024-07-24 18:51:59.876707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:15.000 [2024-07-24 18:51:59.876719] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2042be0 00:15:15.000 [2024-07-24 18:51:59.876725] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.000 [2024-07-24 18:51:59.876981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.000 [2024-07-24 18:51:59.876993] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:15.000 [2024-07-24 18:51:59.877043] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:15.000 [2024-07-24 18:51:59.877055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:15.000 pt2 00:15:15.000 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:15.000 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:15.000 18:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:15.259 [2024-07-24 18:52:00.049124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:15.259 [2024-07-24 18:52:00.049161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:15.259 [2024-07-24 18:52:00.049172] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2042030 00:15:15.259 [2024-07-24 18:52:00.049179] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.259 [2024-07-24 18:52:00.049418] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.259 [2024-07-24 18:52:00.049430] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:15.259 [2024-07-24 18:52:00.049479] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:15.259 [2024-07-24 18:52:00.049491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:15.259 pt3 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:15.259 [2024-07-24 18:52:00.217551] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:15.259 [2024-07-24 18:52:00.217577] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:15.259 [2024-07-24 18:52:00.217587] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e94c40 00:15:15.259 [2024-07-24 18:52:00.217593] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.259 [2024-07-24 18:52:00.217845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.259 [2024-07-24 18:52:00.217856] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:15.259 [2024-07-24 18:52:00.217897] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:15:15.259 [2024-07-24 18:52:00.217908] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:15.259 [2024-07-24 18:52:00.217991] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2045390 00:15:15.259 [2024-07-24 18:52:00.217997] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:15.259 [2024-07-24 18:52:00.218119] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2043010 00:15:15.259 [2024-07-24 18:52:00.218206] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2045390 00:15:15.259 [2024-07-24 18:52:00.218211] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2045390 00:15:15.259 [2024-07-24 18:52:00.218275] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.259 pt4 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.259 18:52:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.259 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:15.516 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.516 "name": "raid_bdev1", 00:15:15.516 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:15.516 "strip_size_kb": 64, 00:15:15.516 "state": "online", 00:15:15.516 "raid_level": "raid0", 00:15:15.516 "superblock": true, 00:15:15.516 "num_base_bdevs": 4, 00:15:15.516 "num_base_bdevs_discovered": 4, 00:15:15.516 "num_base_bdevs_operational": 4, 00:15:15.516 "base_bdevs_list": [ 00:15:15.516 { 00:15:15.516 "name": "pt1", 00:15:15.516 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:15.516 "is_configured": true, 00:15:15.516 "data_offset": 2048, 00:15:15.516 "data_size": 63488 00:15:15.516 }, 00:15:15.516 { 00:15:15.516 "name": "pt2", 00:15:15.516 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:15.516 "is_configured": true, 00:15:15.516 "data_offset": 2048, 00:15:15.516 "data_size": 63488 00:15:15.516 }, 00:15:15.516 { 00:15:15.516 "name": "pt3", 00:15:15.516 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:15.516 "is_configured": true, 00:15:15.516 "data_offset": 2048, 00:15:15.516 "data_size": 63488 00:15:15.516 }, 00:15:15.516 { 00:15:15.516 "name": "pt4", 00:15:15.516 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:15.516 "is_configured": true, 00:15:15.517 "data_offset": 2048, 00:15:15.517 "data_size": 63488 00:15:15.517 } 00:15:15.517 ] 00:15:15.517 }' 00:15:15.517 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.517 18:52:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:16.083 18:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:16.083 [2024-07-24 18:52:01.031901] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:16.083 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:16.083 "name": "raid_bdev1", 00:15:16.083 "aliases": [ 00:15:16.083 "02d2d5aa-efbc-4332-adc1-6495b467b5d7" 00:15:16.083 ], 00:15:16.083 "product_name": "Raid Volume", 00:15:16.083 "block_size": 512, 00:15:16.083 "num_blocks": 253952, 00:15:16.083 "uuid": 
"02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:16.083 "assigned_rate_limits": { 00:15:16.083 "rw_ios_per_sec": 0, 00:15:16.083 "rw_mbytes_per_sec": 0, 00:15:16.083 "r_mbytes_per_sec": 0, 00:15:16.083 "w_mbytes_per_sec": 0 00:15:16.083 }, 00:15:16.083 "claimed": false, 00:15:16.083 "zoned": false, 00:15:16.083 "supported_io_types": { 00:15:16.083 "read": true, 00:15:16.083 "write": true, 00:15:16.083 "unmap": true, 00:15:16.083 "flush": true, 00:15:16.083 "reset": true, 00:15:16.083 "nvme_admin": false, 00:15:16.083 "nvme_io": false, 00:15:16.083 "nvme_io_md": false, 00:15:16.083 "write_zeroes": true, 00:15:16.083 "zcopy": false, 00:15:16.083 "get_zone_info": false, 00:15:16.083 "zone_management": false, 00:15:16.083 "zone_append": false, 00:15:16.083 "compare": false, 00:15:16.083 "compare_and_write": false, 00:15:16.083 "abort": false, 00:15:16.083 "seek_hole": false, 00:15:16.083 "seek_data": false, 00:15:16.083 "copy": false, 00:15:16.083 "nvme_iov_md": false 00:15:16.083 }, 00:15:16.083 "memory_domains": [ 00:15:16.083 { 00:15:16.083 "dma_device_id": "system", 00:15:16.083 "dma_device_type": 1 00:15:16.083 }, 00:15:16.083 { 00:15:16.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.084 "dma_device_type": 2 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "system", 00:15:16.084 "dma_device_type": 1 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.084 "dma_device_type": 2 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "system", 00:15:16.084 "dma_device_type": 1 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.084 "dma_device_type": 2 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "system", 00:15:16.084 "dma_device_type": 1 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.084 "dma_device_type": 2 00:15:16.084 } 00:15:16.084 ], 00:15:16.084 "driver_specific": { 00:15:16.084 "raid": { 00:15:16.084 "uuid": "02d2d5aa-efbc-4332-adc1-6495b467b5d7", 00:15:16.084 "strip_size_kb": 64, 00:15:16.084 "state": "online", 00:15:16.084 "raid_level": "raid0", 00:15:16.084 "superblock": true, 00:15:16.084 "num_base_bdevs": 4, 00:15:16.084 "num_base_bdevs_discovered": 4, 00:15:16.084 "num_base_bdevs_operational": 4, 00:15:16.084 "base_bdevs_list": [ 00:15:16.084 { 00:15:16.084 "name": "pt1", 00:15:16.084 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:16.084 "is_configured": true, 00:15:16.084 "data_offset": 2048, 00:15:16.084 "data_size": 63488 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "name": "pt2", 00:15:16.084 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:16.084 "is_configured": true, 00:15:16.084 "data_offset": 2048, 00:15:16.084 "data_size": 63488 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "name": "pt3", 00:15:16.084 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:16.084 "is_configured": true, 00:15:16.084 "data_offset": 2048, 00:15:16.084 "data_size": 63488 00:15:16.084 }, 00:15:16.084 { 00:15:16.084 "name": "pt4", 00:15:16.084 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:16.084 "is_configured": true, 00:15:16.084 "data_offset": 2048, 00:15:16.084 "data_size": 63488 00:15:16.084 } 00:15:16.084 ] 00:15:16.084 } 00:15:16.084 } 00:15:16.084 }' 00:15:16.084 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:16.084 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:15:16.084 pt2 00:15:16.084 pt3 00:15:16.084 pt4' 00:15:16.084 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.084 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:16.084 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:16.342 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.342 "name": "pt1", 00:15:16.342 "aliases": [ 00:15:16.342 "00000000-0000-0000-0000-000000000001" 00:15:16.342 ], 00:15:16.342 "product_name": "passthru", 00:15:16.342 "block_size": 512, 00:15:16.342 "num_blocks": 65536, 00:15:16.342 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:16.342 "assigned_rate_limits": { 00:15:16.343 "rw_ios_per_sec": 0, 00:15:16.343 "rw_mbytes_per_sec": 0, 00:15:16.343 "r_mbytes_per_sec": 0, 00:15:16.343 "w_mbytes_per_sec": 0 00:15:16.343 }, 00:15:16.343 "claimed": true, 00:15:16.343 "claim_type": "exclusive_write", 00:15:16.343 "zoned": false, 00:15:16.343 "supported_io_types": { 00:15:16.343 "read": true, 00:15:16.343 "write": true, 00:15:16.343 "unmap": true, 00:15:16.343 "flush": true, 00:15:16.343 "reset": true, 00:15:16.343 "nvme_admin": false, 00:15:16.343 "nvme_io": false, 00:15:16.343 "nvme_io_md": false, 00:15:16.343 "write_zeroes": true, 00:15:16.343 "zcopy": true, 00:15:16.343 "get_zone_info": false, 00:15:16.343 "zone_management": false, 00:15:16.343 "zone_append": false, 00:15:16.343 "compare": false, 00:15:16.343 "compare_and_write": false, 00:15:16.343 "abort": true, 00:15:16.343 "seek_hole": false, 00:15:16.343 "seek_data": false, 00:15:16.343 "copy": true, 00:15:16.343 "nvme_iov_md": false 00:15:16.343 }, 00:15:16.343 "memory_domains": [ 00:15:16.343 { 00:15:16.343 "dma_device_id": "system", 00:15:16.343 "dma_device_type": 1 00:15:16.343 }, 00:15:16.343 { 00:15:16.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.343 "dma_device_type": 2 00:15:16.343 } 00:15:16.343 ], 00:15:16.343 "driver_specific": { 00:15:16.343 "passthru": { 00:15:16.343 "name": "pt1", 00:15:16.343 "base_bdev_name": "malloc1" 00:15:16.343 } 00:15:16.343 } 00:15:16.343 }' 00:15:16.343 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.343 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.343 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.343 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:16.601 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:16.859 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.859 "name": "pt2", 00:15:16.859 "aliases": [ 00:15:16.859 "00000000-0000-0000-0000-000000000002" 00:15:16.859 ], 00:15:16.860 "product_name": "passthru", 00:15:16.860 "block_size": 512, 00:15:16.860 "num_blocks": 65536, 00:15:16.860 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:16.860 "assigned_rate_limits": { 00:15:16.860 "rw_ios_per_sec": 0, 00:15:16.860 "rw_mbytes_per_sec": 0, 00:15:16.860 "r_mbytes_per_sec": 0, 00:15:16.860 "w_mbytes_per_sec": 0 00:15:16.860 }, 00:15:16.860 "claimed": true, 00:15:16.860 "claim_type": "exclusive_write", 00:15:16.860 "zoned": false, 00:15:16.860 "supported_io_types": { 00:15:16.860 "read": true, 00:15:16.860 "write": true, 00:15:16.860 "unmap": true, 00:15:16.860 "flush": true, 00:15:16.860 "reset": true, 00:15:16.860 "nvme_admin": false, 00:15:16.860 "nvme_io": false, 00:15:16.860 "nvme_io_md": false, 00:15:16.860 "write_zeroes": true, 00:15:16.860 "zcopy": true, 00:15:16.860 "get_zone_info": false, 00:15:16.860 "zone_management": false, 00:15:16.860 "zone_append": false, 00:15:16.860 "compare": false, 00:15:16.860 "compare_and_write": false, 00:15:16.860 "abort": true, 00:15:16.860 "seek_hole": false, 00:15:16.860 "seek_data": false, 00:15:16.860 "copy": true, 00:15:16.860 "nvme_iov_md": false 00:15:16.860 }, 00:15:16.860 "memory_domains": [ 00:15:16.860 { 00:15:16.860 "dma_device_id": "system", 00:15:16.860 "dma_device_type": 1 00:15:16.860 }, 00:15:16.860 { 00:15:16.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.860 "dma_device_type": 2 00:15:16.860 } 00:15:16.860 ], 00:15:16.860 "driver_specific": { 00:15:16.860 "passthru": { 00:15:16.860 "name": "pt2", 00:15:16.860 "base_bdev_name": "malloc2" 00:15:16.860 } 00:15:16.860 } 00:15:16.860 }' 00:15:16.860 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.860 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.860 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.860 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.860 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.117 18:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.117 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:17.117 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:17.117 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:17.118 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:17.376 "name": "pt3", 00:15:17.376 "aliases": [ 00:15:17.376 "00000000-0000-0000-0000-000000000003" 00:15:17.376 ], 00:15:17.376 "product_name": "passthru", 00:15:17.376 "block_size": 512, 00:15:17.376 "num_blocks": 65536, 00:15:17.376 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:17.376 "assigned_rate_limits": { 00:15:17.376 "rw_ios_per_sec": 0, 00:15:17.376 "rw_mbytes_per_sec": 0, 00:15:17.376 "r_mbytes_per_sec": 0, 00:15:17.376 "w_mbytes_per_sec": 0 00:15:17.376 }, 00:15:17.376 "claimed": true, 00:15:17.376 "claim_type": "exclusive_write", 00:15:17.376 "zoned": false, 00:15:17.376 "supported_io_types": { 00:15:17.376 "read": true, 00:15:17.376 "write": true, 00:15:17.376 "unmap": true, 00:15:17.376 "flush": true, 00:15:17.376 "reset": true, 00:15:17.376 "nvme_admin": false, 00:15:17.376 "nvme_io": false, 00:15:17.376 "nvme_io_md": false, 00:15:17.376 "write_zeroes": true, 00:15:17.376 "zcopy": true, 00:15:17.376 "get_zone_info": false, 00:15:17.376 "zone_management": false, 00:15:17.376 "zone_append": false, 00:15:17.376 "compare": false, 00:15:17.376 "compare_and_write": false, 00:15:17.376 "abort": true, 00:15:17.376 "seek_hole": false, 00:15:17.376 "seek_data": false, 00:15:17.376 "copy": true, 00:15:17.376 "nvme_iov_md": false 00:15:17.376 }, 00:15:17.376 "memory_domains": [ 00:15:17.376 { 00:15:17.376 "dma_device_id": "system", 00:15:17.376 "dma_device_type": 1 00:15:17.376 }, 00:15:17.376 { 00:15:17.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.376 "dma_device_type": 2 00:15:17.376 } 00:15:17.376 ], 00:15:17.376 "driver_specific": { 00:15:17.376 "passthru": { 00:15:17.376 "name": "pt3", 00:15:17.376 "base_bdev_name": "malloc3" 00:15:17.376 } 00:15:17.376 } 00:15:17.376 }' 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.376 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:17.634 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:17.634 
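At bdev_raid.sh@203-208 the verify_raid_bdev_properties helper loops over the configured base bdevs and checks a few fields of each one through bdev_get_bdevs piped into jq, which is what produces the repeated [[ 512 == 512 ]] and [[ null == null ]] lines in this trace. A minimal sketch of that per-bdev check, assuming the same rpc.py socket used throughout this run (the rpc variable and the hard-coded pt1..pt4 loop are illustrative, not the script's exact code, which derives the names from the raid bdev's base_bdevs_list):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for name in pt1 pt2 pt3 pt4; do
      info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')          # single bdev object for this passthru
      [[ $(echo "$info" | jq .block_size)    == 512  ]]          # passthru over a 512-byte-block malloc
      [[ $(echo "$info" | jq .md_size)       == null ]]          # no separate metadata region
      [[ $(echo "$info" | jq .md_interleave) == null ]]          # no interleaved metadata
      [[ $(echo "$info" | jq .dif_type)      == null ]]          # no DIF protection configured
  done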
18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:17.892 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:17.892 "name": "pt4", 00:15:17.892 "aliases": [ 00:15:17.892 "00000000-0000-0000-0000-000000000004" 00:15:17.892 ], 00:15:17.892 "product_name": "passthru", 00:15:17.892 "block_size": 512, 00:15:17.892 "num_blocks": 65536, 00:15:17.892 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:17.892 "assigned_rate_limits": { 00:15:17.892 "rw_ios_per_sec": 0, 00:15:17.892 "rw_mbytes_per_sec": 0, 00:15:17.892 "r_mbytes_per_sec": 0, 00:15:17.892 "w_mbytes_per_sec": 0 00:15:17.892 }, 00:15:17.892 "claimed": true, 00:15:17.892 "claim_type": "exclusive_write", 00:15:17.892 "zoned": false, 00:15:17.892 "supported_io_types": { 00:15:17.892 "read": true, 00:15:17.892 "write": true, 00:15:17.892 "unmap": true, 00:15:17.892 "flush": true, 00:15:17.892 "reset": true, 00:15:17.892 "nvme_admin": false, 00:15:17.892 "nvme_io": false, 00:15:17.892 "nvme_io_md": false, 00:15:17.892 "write_zeroes": true, 00:15:17.892 "zcopy": true, 00:15:17.892 "get_zone_info": false, 00:15:17.892 "zone_management": false, 00:15:17.892 "zone_append": false, 00:15:17.892 "compare": false, 00:15:17.892 "compare_and_write": false, 00:15:17.892 "abort": true, 00:15:17.892 "seek_hole": false, 00:15:17.892 "seek_data": false, 00:15:17.892 "copy": true, 00:15:17.893 "nvme_iov_md": false 00:15:17.893 }, 00:15:17.893 "memory_domains": [ 00:15:17.893 { 00:15:17.893 "dma_device_id": "system", 00:15:17.893 "dma_device_type": 1 00:15:17.893 }, 00:15:17.893 { 00:15:17.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.893 "dma_device_type": 2 00:15:17.893 } 00:15:17.893 ], 00:15:17.893 "driver_specific": { 00:15:17.893 "passthru": { 00:15:17.893 "name": "pt4", 00:15:17.893 "base_bdev_name": "malloc4" 00:15:17.893 } 00:15:17.893 } 00:15:17.893 }' 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:17.893 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.152 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.152 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:18.152 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:18.152 18:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:18.152 [2024-07-24 18:52:03.109248] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:18.152 18:52:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 02d2d5aa-efbc-4332-adc1-6495b467b5d7 '!=' 02d2d5aa-efbc-4332-adc1-6495b467b5d7 ']' 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2115517 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2115517 ']' 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2115517 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:18.152 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2115517 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2115517' 00:15:18.411 killing process with pid 2115517 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2115517 00:15:18.411 [2024-07-24 18:52:03.163783] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:18.411 [2024-07-24 18:52:03.163833] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:18.411 [2024-07-24 18:52:03.163878] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:18.411 [2024-07-24 18:52:03.163884] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2045390 name raid_bdev1, state offline 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2115517 00:15:18.411 [2024-07-24 18:52:03.195879] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:18.411 00:15:18.411 real 0m12.187s 00:15:18.411 user 0m22.240s 00:15:18.411 sys 0m1.867s 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:18.411 18:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.411 ************************************ 00:15:18.411 END TEST raid_superblock_test 00:15:18.411 ************************************ 00:15:18.411 18:52:03 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:15:18.411 18:52:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:18.411 18:52:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:18.411 18:52:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:18.669 ************************************ 00:15:18.669 START TEST raid_read_error_test 00:15:18.669 ************************************ 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:15:18.669 18:52:03 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.wIA0TlSAen 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2117896 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2117896 /var/tmp/spdk-raid.sock 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@829 -- # '[' -z 2117896 ']' 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:18.669 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:18.669 18:52:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.669 [2024-07-24 18:52:03.498550] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:15:18.669 [2024-07-24 18:52:03.498588] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2117896 ] 00:15:18.669 [2024-07-24 18:52:03.561776] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:18.669 [2024-07-24 18:52:03.640466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:18.928 [2024-07-24 18:52:03.697152] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:18.928 [2024-07-24 18:52:03.697175] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:19.493 18:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:19.493 18:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:19.493 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:19.493 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:19.493 BaseBdev1_malloc 00:15:19.493 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:19.751 true 00:15:19.751 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:20.010 [2024-07-24 18:52:04.773175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:20.010 [2024-07-24 18:52:04.773206] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:20.010 [2024-07-24 18:52:04.773217] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20a7d20 00:15:20.010 [2024-07-24 18:52:04.773223] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:20.010 [2024-07-24 18:52:04.774371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:20.010 [2024-07-24 18:52:04.774392] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:20.010 BaseBdev1 00:15:20.010 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 
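Each entry in the base_bdevs array is then built up as a three-layer stack at bdev_raid.sh@813-815: a 32 MB malloc bdev with 512-byte blocks, an error bdev on top of it (exposed by the RPC as EE_<malloc name>), and a passthru bdev that the raid0 volume will eventually claim. A condensed sketch of that setup, assuming the same rpc.py socket as the rest of the run:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for bdev in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
      $rpc bdev_malloc_create 32 512 -b "${bdev}_malloc"              # 32 MB backing store, 512-byte blocks
      $rpc bdev_error_create "${bdev}_malloc"                         # creates EE_${bdev}_malloc for fault injection
      $rpc bdev_passthru_create -b "EE_${bdev}_malloc" -p "$bdev"     # the bdev the RAID volume will claim
  done
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

The error bdev in the middle of each stack is what lets the test later flip individual base bdevs into a failing state without touching the malloc backing store.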
00:15:20.010 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:20.010 BaseBdev2_malloc 00:15:20.010 18:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:20.269 true 00:15:20.269 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:20.269 [2024-07-24 18:52:05.257942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:20.269 [2024-07-24 18:52:05.257973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:20.269 [2024-07-24 18:52:05.257983] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20acd50 00:15:20.269 [2024-07-24 18:52:05.257989] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:20.269 [2024-07-24 18:52:05.258969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:20.269 [2024-07-24 18:52:05.258990] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:20.269 BaseBdev2 00:15:20.269 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:20.269 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:20.528 BaseBdev3_malloc 00:15:20.528 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:20.785 true 00:15:20.785 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:20.785 [2024-07-24 18:52:05.742877] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:20.786 [2024-07-24 18:52:05.742908] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:20.786 [2024-07-24 18:52:05.742919] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20abef0 00:15:20.786 [2024-07-24 18:52:05.742925] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:20.786 [2024-07-24 18:52:05.743976] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:20.786 [2024-07-24 18:52:05.743996] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:20.786 BaseBdev3 00:15:20.786 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:20.786 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:21.043 BaseBdev4_malloc 00:15:21.043 18:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:21.302 true 00:15:21.302 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:21.302 [2024-07-24 18:52:06.223503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:21.302 [2024-07-24 18:52:06.223532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:21.302 [2024-07-24 18:52:06.223544] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b0280 00:15:21.302 [2024-07-24 18:52:06.223549] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:21.302 [2024-07-24 18:52:06.224516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:21.302 [2024-07-24 18:52:06.224536] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:21.302 BaseBdev4 00:15:21.302 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:21.561 [2024-07-24 18:52:06.379932] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:21.561 [2024-07-24 18:52:06.380782] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:21.561 [2024-07-24 18:52:06.380826] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:21.561 [2024-07-24 18:52:06.380863] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:21.561 [2024-07-24 18:52:06.381021] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20b1d90 00:15:21.561 [2024-07-24 18:52:06.381027] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:21.561 [2024-07-24 18:52:06.381154] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20af8d0 00:15:21.561 [2024-07-24 18:52:06.381255] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20b1d90 00:15:21.561 [2024-07-24 18:52:06.381260] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20b1d90 00:15:21.561 [2024-07-24 18:52:06.381326] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.561 "name": "raid_bdev1", 00:15:21.561 "uuid": "f31a1820-61cf-4a23-a6ba-d65d10ebf311", 00:15:21.561 "strip_size_kb": 64, 00:15:21.561 "state": "online", 00:15:21.561 "raid_level": "raid0", 00:15:21.561 "superblock": true, 00:15:21.561 "num_base_bdevs": 4, 00:15:21.561 "num_base_bdevs_discovered": 4, 00:15:21.561 "num_base_bdevs_operational": 4, 00:15:21.561 "base_bdevs_list": [ 00:15:21.561 { 00:15:21.561 "name": "BaseBdev1", 00:15:21.561 "uuid": "3135438d-ed5a-5bfc-aedd-5ead2d586256", 00:15:21.561 "is_configured": true, 00:15:21.561 "data_offset": 2048, 00:15:21.561 "data_size": 63488 00:15:21.561 }, 00:15:21.561 { 00:15:21.561 "name": "BaseBdev2", 00:15:21.561 "uuid": "e00f8bee-f60a-5cde-acfe-14b87c2ec580", 00:15:21.561 "is_configured": true, 00:15:21.561 "data_offset": 2048, 00:15:21.561 "data_size": 63488 00:15:21.561 }, 00:15:21.561 { 00:15:21.561 "name": "BaseBdev3", 00:15:21.561 "uuid": "6e2bf143-b84b-5471-bc29-f3c50d96b08c", 00:15:21.561 "is_configured": true, 00:15:21.561 "data_offset": 2048, 00:15:21.561 "data_size": 63488 00:15:21.561 }, 00:15:21.561 { 00:15:21.561 "name": "BaseBdev4", 00:15:21.561 "uuid": "30b8a09d-4273-5e38-93e0-047d4f94a293", 00:15:21.561 "is_configured": true, 00:15:21.561 "data_offset": 2048, 00:15:21.561 "data_size": 63488 00:15:21.561 } 00:15:21.561 ] 00:15:21.561 }' 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.561 18:52:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:22.126 18:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:22.126 18:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:22.126 [2024-07-24 18:52:07.122047] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20b51c0 00:15:23.060 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
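With raid_bdev1 online, the read-error pass (bdev_raid.sh@823-827 and @843-847 in this trace) runs bdevperf's queued randrw job against the volume, injects read failures into EE_BaseBdev1_malloc while it runs, and afterwards scrapes the bdevperf log for the raid_bdev1 failure rate. A rough, compressed sketch of that flow, assuming the mktemp log path shown earlier for this test; the backgrounding and exact ordering are inferred from the interleaved trace, not copied verbatim from the script:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  bdevperf_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py
  $bdevperf_py -s /var/tmp/spdk-raid.sock perform_tests &              # start the queued randrw workload
  sleep 1
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure        # make reads on that base bdev fail
  wait
  fail_per_s=$(grep -v Job /raidtest/tmp.wIA0TlSAen | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s != 0.00 ]]    # raid0 has no redundancy, so injected read errors must surface as failed I/O

The non-zero 0.52 figure checked at @847 further down is exactly this value: with no redundancy, a failing base bdev has to show up as failed I/O on the RAID volume itself.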
00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.320 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:23.578 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.578 "name": "raid_bdev1", 00:15:23.578 "uuid": "f31a1820-61cf-4a23-a6ba-d65d10ebf311", 00:15:23.578 "strip_size_kb": 64, 00:15:23.578 "state": "online", 00:15:23.578 "raid_level": "raid0", 00:15:23.578 "superblock": true, 00:15:23.578 "num_base_bdevs": 4, 00:15:23.578 "num_base_bdevs_discovered": 4, 00:15:23.578 "num_base_bdevs_operational": 4, 00:15:23.578 "base_bdevs_list": [ 00:15:23.578 { 00:15:23.578 "name": "BaseBdev1", 00:15:23.578 "uuid": "3135438d-ed5a-5bfc-aedd-5ead2d586256", 00:15:23.578 "is_configured": true, 00:15:23.578 "data_offset": 2048, 00:15:23.578 "data_size": 63488 00:15:23.578 }, 00:15:23.578 { 00:15:23.578 "name": "BaseBdev2", 00:15:23.578 "uuid": "e00f8bee-f60a-5cde-acfe-14b87c2ec580", 00:15:23.578 "is_configured": true, 00:15:23.578 "data_offset": 2048, 00:15:23.578 "data_size": 63488 00:15:23.578 }, 00:15:23.578 { 00:15:23.578 "name": "BaseBdev3", 00:15:23.578 "uuid": "6e2bf143-b84b-5471-bc29-f3c50d96b08c", 00:15:23.578 "is_configured": true, 00:15:23.578 "data_offset": 2048, 00:15:23.578 "data_size": 63488 00:15:23.578 }, 00:15:23.578 { 00:15:23.578 "name": "BaseBdev4", 00:15:23.578 "uuid": "30b8a09d-4273-5e38-93e0-047d4f94a293", 00:15:23.578 "is_configured": true, 00:15:23.578 "data_offset": 2048, 00:15:23.578 "data_size": 63488 00:15:23.578 } 00:15:23.578 ] 00:15:23.578 }' 00:15:23.578 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.578 18:52:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.144 18:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:24.144 [2024-07-24 18:52:09.050860] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:24.144 [2024-07-24 18:52:09.050893] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:24.144 [2024-07-24 18:52:09.053008] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:24.144 [2024-07-24 18:52:09.053031] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.144 [2024-07-24 18:52:09.053056] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:24.144 [2024-07-24 18:52:09.053061] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20b1d90 name 
raid_bdev1, state offline 00:15:24.144 0 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2117896 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2117896 ']' 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2117896 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2117896 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2117896' 00:15:24.144 killing process with pid 2117896 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2117896 00:15:24.144 [2024-07-24 18:52:09.112811] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:24.144 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2117896 00:15:24.144 [2024-07-24 18:52:09.139151] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.wIA0TlSAen 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:24.403 00:15:24.403 real 0m5.889s 00:15:24.403 user 0m9.240s 00:15:24.403 sys 0m0.859s 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:24.403 18:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.403 ************************************ 00:15:24.403 END TEST raid_read_error_test 00:15:24.403 ************************************ 00:15:24.403 18:52:09 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:15:24.403 18:52:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:24.403 18:52:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:24.403 18:52:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:24.403 ************************************ 00:15:24.403 START TEST raid_write_error_test 00:15:24.403 ************************************ 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:24.403 18:52:09 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ST6Vral6TQ 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2119043 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2119043 /var/tmp/spdk-raid.sock 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2119043 ']' 00:15:24.403 18:52:09 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:24.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:24.403 18:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.661 [2024-07-24 18:52:09.457273] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:15:24.661 [2024-07-24 18:52:09.457314] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2119043 ] 00:15:24.661 [2024-07-24 18:52:09.519587] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:24.661 [2024-07-24 18:52:09.599118] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:24.661 [2024-07-24 18:52:09.648904] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:24.661 [2024-07-24 18:52:09.648929] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:25.592 18:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:25.593 18:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:25.593 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:25.593 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:25.593 BaseBdev1_malloc 00:15:25.593 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:25.593 true 00:15:25.593 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:25.851 [2024-07-24 18:52:10.724869] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:25.851 [2024-07-24 18:52:10.724901] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:25.851 [2024-07-24 18:52:10.724913] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bbd20 00:15:25.851 [2024-07-24 18:52:10.724919] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:25.851 [2024-07-24 18:52:10.726093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:25.851 [2024-07-24 18:52:10.726114] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:25.851 BaseBdev1 00:15:25.851 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:25.851 18:52:10 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:26.110 BaseBdev2_malloc 00:15:26.110 18:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:26.110 true 00:15:26.110 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:26.367 [2024-07-24 18:52:11.217588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:26.367 [2024-07-24 18:52:11.217626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:26.367 [2024-07-24 18:52:11.217638] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c0d50 00:15:26.367 [2024-07-24 18:52:11.217643] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:26.367 [2024-07-24 18:52:11.218730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:26.367 [2024-07-24 18:52:11.218749] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:26.367 BaseBdev2 00:15:26.367 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:26.367 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:26.650 BaseBdev3_malloc 00:15:26.650 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:26.650 true 00:15:26.650 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:26.909 [2024-07-24 18:52:11.718452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:26.909 [2024-07-24 18:52:11.718487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:26.909 [2024-07-24 18:52:11.718514] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bfef0 00:15:26.909 [2024-07-24 18:52:11.718520] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:26.909 [2024-07-24 18:52:11.719592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:26.909 [2024-07-24 18:52:11.719612] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:26.909 BaseBdev3 00:15:26.909 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:26.909 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:26.909 BaseBdev4_malloc 00:15:26.909 18:52:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create 
BaseBdev4_malloc 00:15:27.167 true 00:15:27.167 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:27.425 [2024-07-24 18:52:12.223237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:27.425 [2024-07-24 18:52:12.223269] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:27.425 [2024-07-24 18:52:12.223281] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c4280 00:15:27.425 [2024-07-24 18:52:12.223287] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:27.425 [2024-07-24 18:52:12.224354] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:27.425 [2024-07-24 18:52:12.224375] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:27.425 BaseBdev4 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:27.425 [2024-07-24 18:52:12.379669] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:27.425 [2024-07-24 18:52:12.380529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:27.425 [2024-07-24 18:52:12.380574] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:27.425 [2024-07-24 18:52:12.380609] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:27.425 [2024-07-24 18:52:12.380762] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23c5d90 00:15:27.425 [2024-07-24 18:52:12.380768] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:27.425 [2024-07-24 18:52:12.380898] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c38d0 00:15:27.425 [2024-07-24 18:52:12.380995] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23c5d90 00:15:27.425 [2024-07-24 18:52:12.381000] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23c5d90 00:15:27.425 [2024-07-24 18:52:12.381063] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.425 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:27.685 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.685 "name": "raid_bdev1", 00:15:27.685 "uuid": "af8e257b-ba23-4e4f-a13d-c5d2fd218d7b", 00:15:27.685 "strip_size_kb": 64, 00:15:27.685 "state": "online", 00:15:27.685 "raid_level": "raid0", 00:15:27.685 "superblock": true, 00:15:27.685 "num_base_bdevs": 4, 00:15:27.685 "num_base_bdevs_discovered": 4, 00:15:27.685 "num_base_bdevs_operational": 4, 00:15:27.685 "base_bdevs_list": [ 00:15:27.685 { 00:15:27.685 "name": "BaseBdev1", 00:15:27.685 "uuid": "708e59b3-8e80-5703-b65f-2b57e504f471", 00:15:27.685 "is_configured": true, 00:15:27.685 "data_offset": 2048, 00:15:27.685 "data_size": 63488 00:15:27.685 }, 00:15:27.685 { 00:15:27.685 "name": "BaseBdev2", 00:15:27.685 "uuid": "8c363686-6cb2-560c-8f58-0012e158233b", 00:15:27.685 "is_configured": true, 00:15:27.685 "data_offset": 2048, 00:15:27.685 "data_size": 63488 00:15:27.685 }, 00:15:27.685 { 00:15:27.685 "name": "BaseBdev3", 00:15:27.685 "uuid": "20f114e7-3ff3-5c49-aafe-31013d6ff72f", 00:15:27.685 "is_configured": true, 00:15:27.685 "data_offset": 2048, 00:15:27.685 "data_size": 63488 00:15:27.685 }, 00:15:27.685 { 00:15:27.685 "name": "BaseBdev4", 00:15:27.685 "uuid": "e7c3a014-3351-5870-82d4-0112964b11b6", 00:15:27.685 "is_configured": true, 00:15:27.685 "data_offset": 2048, 00:15:27.685 "data_size": 63488 00:15:27.685 } 00:15:27.685 ] 00:15:27.685 }' 00:15:27.685 18:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.685 18:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.251 18:52:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:28.251 18:52:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:28.251 [2024-07-24 18:52:13.093727] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c91c0 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.186 18:52:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.186 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.444 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.444 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:29.444 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.444 "name": "raid_bdev1", 00:15:29.444 "uuid": "af8e257b-ba23-4e4f-a13d-c5d2fd218d7b", 00:15:29.444 "strip_size_kb": 64, 00:15:29.444 "state": "online", 00:15:29.444 "raid_level": "raid0", 00:15:29.444 "superblock": true, 00:15:29.444 "num_base_bdevs": 4, 00:15:29.444 "num_base_bdevs_discovered": 4, 00:15:29.444 "num_base_bdevs_operational": 4, 00:15:29.444 "base_bdevs_list": [ 00:15:29.444 { 00:15:29.444 "name": "BaseBdev1", 00:15:29.444 "uuid": "708e59b3-8e80-5703-b65f-2b57e504f471", 00:15:29.444 "is_configured": true, 00:15:29.444 "data_offset": 2048, 00:15:29.444 "data_size": 63488 00:15:29.444 }, 00:15:29.444 { 00:15:29.444 "name": "BaseBdev2", 00:15:29.444 "uuid": "8c363686-6cb2-560c-8f58-0012e158233b", 00:15:29.444 "is_configured": true, 00:15:29.444 "data_offset": 2048, 00:15:29.444 "data_size": 63488 00:15:29.444 }, 00:15:29.444 { 00:15:29.444 "name": "BaseBdev3", 00:15:29.444 "uuid": "20f114e7-3ff3-5c49-aafe-31013d6ff72f", 00:15:29.444 "is_configured": true, 00:15:29.444 "data_offset": 2048, 00:15:29.444 "data_size": 63488 00:15:29.444 }, 00:15:29.444 { 00:15:29.444 "name": "BaseBdev4", 00:15:29.444 "uuid": "e7c3a014-3351-5870-82d4-0112964b11b6", 00:15:29.444 "is_configured": true, 00:15:29.444 "data_offset": 2048, 00:15:29.444 "data_size": 63488 00:15:29.444 } 00:15:29.444 ] 00:15:29.444 }' 00:15:29.444 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.444 18:52:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.010 18:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:30.268 [2024-07-24 18:52:15.027145] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:30.268 [2024-07-24 18:52:15.027178] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:30.268 [2024-07-24 18:52:15.029189] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:30.268 [2024-07-24 18:52:15.029214] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.268 [2024-07-24 18:52:15.029238] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:30.268 [2024-07-24 18:52:15.029244] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c5d90 name raid_bdev1, state 
offline 00:15:30.268 0 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2119043 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2119043 ']' 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2119043 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2119043 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2119043' 00:15:30.268 killing process with pid 2119043 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2119043 00:15:30.268 [2024-07-24 18:52:15.092025] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:30.268 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2119043 00:15:30.268 [2024-07-24 18:52:15.118064] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ST6Vral6TQ 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:30.526 00:15:30.526 real 0m5.909s 00:15:30.526 user 0m9.309s 00:15:30.526 sys 0m0.844s 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:30.526 18:52:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.526 ************************************ 00:15:30.526 END TEST raid_write_error_test 00:15:30.526 ************************************ 00:15:30.526 18:52:15 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:30.526 18:52:15 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:15:30.526 18:52:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:30.526 18:52:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:30.526 18:52:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:30.526 ************************************ 00:15:30.526 START TEST raid_state_function_test 00:15:30.526 ************************************ 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:15:30.526 
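For reference, the raid_write_error_test that just finished reduces to the RPC sequence below. This is an illustrative condensation of the trace above, not the actual bdev_raid.sh helper: each base bdev is a malloc bdev wrapped by an error bdev (which exposes the EE_-prefixed name) and a passthru bdev, a raid0 volume is built on top, I/O is driven through bdevperf's perform_tests RPC, and a write failure is injected into one leg so the test can count failed writes.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc bdev_error_create BaseBdev${i}_malloc                      # exposes EE_BaseBdev${i}_malloc
        $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure      # fail writes on one base bdev
    # I/O is generated with:
    # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    $rpc bdev_raid_delete raid_bdev1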
18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:30.526 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2120142 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2120142' 00:15:30.527 Process raid pid: 2120142 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2120142 /var/tmp/spdk-raid.sock 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2120142 ']' 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:30.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:30.527 18:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.527 [2024-07-24 18:52:15.429586] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:15:30.527 [2024-07-24 18:52:15.429623] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:30.527 [2024-07-24 18:52:15.493686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:30.785 [2024-07-24 18:52:15.563614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:30.785 [2024-07-24 18:52:15.612745] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:30.785 [2024-07-24 18:52:15.612767] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:31.353 18:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:31.353 18:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:31.353 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:31.613 [2024-07-24 18:52:16.371282] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:31.613 [2024-07-24 18:52:16.371314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:31.613 [2024-07-24 18:52:16.371320] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:31.613 [2024-07-24 18:52:16.371326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:31.613 [2024-07-24 18:52:16.371334] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:31.613 [2024-07-24 18:52:16.371340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:31.613 [2024-07-24 18:52:16.371344] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:31.613 [2024-07-24 18:52:16.371349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.613 "name": "Existed_Raid", 00:15:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.613 "strip_size_kb": 64, 00:15:31.613 "state": "configuring", 00:15:31.613 "raid_level": "concat", 00:15:31.613 "superblock": false, 00:15:31.613 "num_base_bdevs": 4, 00:15:31.613 "num_base_bdevs_discovered": 0, 00:15:31.613 "num_base_bdevs_operational": 4, 00:15:31.613 "base_bdevs_list": [ 00:15:31.613 { 00:15:31.613 "name": "BaseBdev1", 00:15:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.613 "is_configured": false, 00:15:31.613 "data_offset": 0, 00:15:31.613 "data_size": 0 00:15:31.613 }, 00:15:31.613 { 00:15:31.613 "name": "BaseBdev2", 00:15:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.613 "is_configured": false, 00:15:31.613 "data_offset": 0, 00:15:31.613 "data_size": 0 00:15:31.613 }, 00:15:31.613 { 00:15:31.613 "name": "BaseBdev3", 00:15:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.613 "is_configured": false, 00:15:31.613 "data_offset": 0, 00:15:31.613 "data_size": 0 00:15:31.613 }, 00:15:31.613 { 00:15:31.613 "name": "BaseBdev4", 00:15:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.613 "is_configured": false, 00:15:31.613 "data_offset": 0, 00:15:31.613 "data_size": 0 00:15:31.613 } 00:15:31.613 ] 00:15:31.613 }' 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.613 18:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.179 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:32.438 [2024-07-24 18:52:17.209325] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:32.438 [2024-07-24 18:52:17.209346] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd2dbc0 name Existed_Raid, state configuring 00:15:32.438 
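The concat volume above was created before any of its base bdevs exist, so it reports state "configuring" with num_base_bdevs_discovered at 0; the test then deletes it and recreates it to walk the same path again. A condensed sketch of that setup, with the command lines taken from the trace (the real verify_raid_bdev_state helper in bdev/bdev_raid.sh checks several more fields):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # -> configuring
    $rpc bdev_raid_delete Existed_Raid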
18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:32.438 [2024-07-24 18:52:17.385796] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:32.438 [2024-07-24 18:52:17.385814] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:32.438 [2024-07-24 18:52:17.385818] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:32.438 [2024-07-24 18:52:17.385826] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:32.438 [2024-07-24 18:52:17.385830] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:32.438 [2024-07-24 18:52:17.385835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:32.438 [2024-07-24 18:52:17.385839] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:32.438 [2024-07-24 18:52:17.385843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:32.438 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:32.697 [2024-07-24 18:52:17.562360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.697 BaseBdev1 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:32.697 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.955 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:32.955 [ 00:15:32.955 { 00:15:32.955 "name": "BaseBdev1", 00:15:32.955 "aliases": [ 00:15:32.955 "eafd503a-2366-4512-b69c-d5c34961b277" 00:15:32.955 ], 00:15:32.955 "product_name": "Malloc disk", 00:15:32.955 "block_size": 512, 00:15:32.955 "num_blocks": 65536, 00:15:32.955 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:32.955 "assigned_rate_limits": { 00:15:32.955 "rw_ios_per_sec": 0, 00:15:32.955 "rw_mbytes_per_sec": 0, 00:15:32.955 "r_mbytes_per_sec": 0, 00:15:32.955 "w_mbytes_per_sec": 0 00:15:32.955 }, 00:15:32.955 "claimed": true, 00:15:32.955 "claim_type": "exclusive_write", 00:15:32.955 "zoned": false, 00:15:32.955 "supported_io_types": { 00:15:32.955 "read": true, 00:15:32.955 "write": true, 00:15:32.955 "unmap": true, 00:15:32.955 "flush": true, 
00:15:32.955 "reset": true, 00:15:32.956 "nvme_admin": false, 00:15:32.956 "nvme_io": false, 00:15:32.956 "nvme_io_md": false, 00:15:32.956 "write_zeroes": true, 00:15:32.956 "zcopy": true, 00:15:32.956 "get_zone_info": false, 00:15:32.956 "zone_management": false, 00:15:32.956 "zone_append": false, 00:15:32.956 "compare": false, 00:15:32.956 "compare_and_write": false, 00:15:32.956 "abort": true, 00:15:32.956 "seek_hole": false, 00:15:32.956 "seek_data": false, 00:15:32.956 "copy": true, 00:15:32.956 "nvme_iov_md": false 00:15:32.956 }, 00:15:32.956 "memory_domains": [ 00:15:32.956 { 00:15:32.956 "dma_device_id": "system", 00:15:32.956 "dma_device_type": 1 00:15:32.956 }, 00:15:32.956 { 00:15:32.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.956 "dma_device_type": 2 00:15:32.956 } 00:15:32.956 ], 00:15:32.956 "driver_specific": {} 00:15:32.956 } 00:15:32.956 ] 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.956 18:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.214 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.214 "name": "Existed_Raid", 00:15:33.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.214 "strip_size_kb": 64, 00:15:33.214 "state": "configuring", 00:15:33.214 "raid_level": "concat", 00:15:33.214 "superblock": false, 00:15:33.214 "num_base_bdevs": 4, 00:15:33.214 "num_base_bdevs_discovered": 1, 00:15:33.214 "num_base_bdevs_operational": 4, 00:15:33.214 "base_bdevs_list": [ 00:15:33.214 { 00:15:33.214 "name": "BaseBdev1", 00:15:33.214 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:33.214 "is_configured": true, 00:15:33.214 "data_offset": 0, 00:15:33.214 "data_size": 65536 00:15:33.214 }, 00:15:33.214 { 00:15:33.214 "name": "BaseBdev2", 00:15:33.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.214 "is_configured": false, 00:15:33.214 "data_offset": 0, 00:15:33.214 "data_size": 0 00:15:33.214 }, 00:15:33.214 { 00:15:33.214 "name": "BaseBdev3", 00:15:33.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.214 
"is_configured": false, 00:15:33.214 "data_offset": 0, 00:15:33.214 "data_size": 0 00:15:33.214 }, 00:15:33.214 { 00:15:33.214 "name": "BaseBdev4", 00:15:33.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.214 "is_configured": false, 00:15:33.214 "data_offset": 0, 00:15:33.214 "data_size": 0 00:15:33.214 } 00:15:33.214 ] 00:15:33.214 }' 00:15:33.214 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.215 18:52:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.781 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:33.781 [2024-07-24 18:52:18.705440] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:33.781 [2024-07-24 18:52:18.705473] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd2d430 name Existed_Raid, state configuring 00:15:33.781 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:34.038 [2024-07-24 18:52:18.873912] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:34.038 [2024-07-24 18:52:18.874984] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:34.038 [2024-07-24 18:52:18.875011] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:34.038 [2024-07-24 18:52:18.875017] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:34.038 [2024-07-24 18:52:18.875022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:34.038 [2024-07-24 18:52:18.875027] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:34.038 [2024-07-24 18:52:18.875032] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.038 18:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.297 18:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.297 "name": "Existed_Raid", 00:15:34.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.297 "strip_size_kb": 64, 00:15:34.297 "state": "configuring", 00:15:34.297 "raid_level": "concat", 00:15:34.297 "superblock": false, 00:15:34.297 "num_base_bdevs": 4, 00:15:34.297 "num_base_bdevs_discovered": 1, 00:15:34.297 "num_base_bdevs_operational": 4, 00:15:34.297 "base_bdevs_list": [ 00:15:34.297 { 00:15:34.297 "name": "BaseBdev1", 00:15:34.297 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:34.297 "is_configured": true, 00:15:34.297 "data_offset": 0, 00:15:34.297 "data_size": 65536 00:15:34.297 }, 00:15:34.298 { 00:15:34.298 "name": "BaseBdev2", 00:15:34.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.298 "is_configured": false, 00:15:34.298 "data_offset": 0, 00:15:34.298 "data_size": 0 00:15:34.298 }, 00:15:34.298 { 00:15:34.298 "name": "BaseBdev3", 00:15:34.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.298 "is_configured": false, 00:15:34.298 "data_offset": 0, 00:15:34.298 "data_size": 0 00:15:34.298 }, 00:15:34.298 { 00:15:34.298 "name": "BaseBdev4", 00:15:34.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.298 "is_configured": false, 00:15:34.298 "data_offset": 0, 00:15:34.298 "data_size": 0 00:15:34.298 } 00:15:34.298 ] 00:15:34.298 }' 00:15:34.298 18:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.298 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.567 18:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:34.825 [2024-07-24 18:52:19.706755] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:34.825 BaseBdev2 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:34.825 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:35.084 18:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:35.084 [ 00:15:35.084 { 00:15:35.084 "name": "BaseBdev2", 00:15:35.084 "aliases": [ 
00:15:35.084 "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1" 00:15:35.084 ], 00:15:35.084 "product_name": "Malloc disk", 00:15:35.084 "block_size": 512, 00:15:35.084 "num_blocks": 65536, 00:15:35.084 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:35.084 "assigned_rate_limits": { 00:15:35.084 "rw_ios_per_sec": 0, 00:15:35.084 "rw_mbytes_per_sec": 0, 00:15:35.084 "r_mbytes_per_sec": 0, 00:15:35.084 "w_mbytes_per_sec": 0 00:15:35.084 }, 00:15:35.084 "claimed": true, 00:15:35.084 "claim_type": "exclusive_write", 00:15:35.084 "zoned": false, 00:15:35.084 "supported_io_types": { 00:15:35.084 "read": true, 00:15:35.084 "write": true, 00:15:35.084 "unmap": true, 00:15:35.084 "flush": true, 00:15:35.084 "reset": true, 00:15:35.084 "nvme_admin": false, 00:15:35.084 "nvme_io": false, 00:15:35.084 "nvme_io_md": false, 00:15:35.084 "write_zeroes": true, 00:15:35.084 "zcopy": true, 00:15:35.084 "get_zone_info": false, 00:15:35.084 "zone_management": false, 00:15:35.084 "zone_append": false, 00:15:35.084 "compare": false, 00:15:35.084 "compare_and_write": false, 00:15:35.084 "abort": true, 00:15:35.084 "seek_hole": false, 00:15:35.084 "seek_data": false, 00:15:35.084 "copy": true, 00:15:35.084 "nvme_iov_md": false 00:15:35.084 }, 00:15:35.084 "memory_domains": [ 00:15:35.084 { 00:15:35.084 "dma_device_id": "system", 00:15:35.084 "dma_device_type": 1 00:15:35.084 }, 00:15:35.084 { 00:15:35.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.084 "dma_device_type": 2 00:15:35.084 } 00:15:35.084 ], 00:15:35.084 "driver_specific": {} 00:15:35.084 } 00:15:35.084 ] 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.084 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.343 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.343 "name": "Existed_Raid", 00:15:35.343 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:35.343 "strip_size_kb": 64, 00:15:35.343 "state": "configuring", 00:15:35.343 "raid_level": "concat", 00:15:35.343 "superblock": false, 00:15:35.343 "num_base_bdevs": 4, 00:15:35.343 "num_base_bdevs_discovered": 2, 00:15:35.343 "num_base_bdevs_operational": 4, 00:15:35.343 "base_bdevs_list": [ 00:15:35.343 { 00:15:35.343 "name": "BaseBdev1", 00:15:35.343 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:35.343 "is_configured": true, 00:15:35.343 "data_offset": 0, 00:15:35.343 "data_size": 65536 00:15:35.343 }, 00:15:35.343 { 00:15:35.343 "name": "BaseBdev2", 00:15:35.343 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:35.343 "is_configured": true, 00:15:35.343 "data_offset": 0, 00:15:35.343 "data_size": 65536 00:15:35.343 }, 00:15:35.343 { 00:15:35.343 "name": "BaseBdev3", 00:15:35.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.343 "is_configured": false, 00:15:35.343 "data_offset": 0, 00:15:35.343 "data_size": 0 00:15:35.343 }, 00:15:35.343 { 00:15:35.343 "name": "BaseBdev4", 00:15:35.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.343 "is_configured": false, 00:15:35.343 "data_offset": 0, 00:15:35.343 "data_size": 0 00:15:35.343 } 00:15:35.343 ] 00:15:35.343 }' 00:15:35.343 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.343 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:35.911 [2024-07-24 18:52:20.824286] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:35.911 BaseBdev3 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:35.911 18:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:36.170 18:52:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:36.170 [ 00:15:36.170 { 00:15:36.170 "name": "BaseBdev3", 00:15:36.170 "aliases": [ 00:15:36.170 "3d9c27ae-3ee0-47b2-857f-bb237073685b" 00:15:36.170 ], 00:15:36.170 "product_name": "Malloc disk", 00:15:36.170 "block_size": 512, 00:15:36.170 "num_blocks": 65536, 00:15:36.170 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:36.170 "assigned_rate_limits": { 00:15:36.170 "rw_ios_per_sec": 0, 00:15:36.170 "rw_mbytes_per_sec": 0, 00:15:36.170 "r_mbytes_per_sec": 0, 00:15:36.170 "w_mbytes_per_sec": 0 00:15:36.170 }, 00:15:36.170 "claimed": true, 00:15:36.170 "claim_type": "exclusive_write", 00:15:36.170 "zoned": 
false, 00:15:36.170 "supported_io_types": { 00:15:36.170 "read": true, 00:15:36.170 "write": true, 00:15:36.170 "unmap": true, 00:15:36.170 "flush": true, 00:15:36.170 "reset": true, 00:15:36.170 "nvme_admin": false, 00:15:36.170 "nvme_io": false, 00:15:36.170 "nvme_io_md": false, 00:15:36.170 "write_zeroes": true, 00:15:36.170 "zcopy": true, 00:15:36.170 "get_zone_info": false, 00:15:36.170 "zone_management": false, 00:15:36.170 "zone_append": false, 00:15:36.170 "compare": false, 00:15:36.170 "compare_and_write": false, 00:15:36.170 "abort": true, 00:15:36.170 "seek_hole": false, 00:15:36.170 "seek_data": false, 00:15:36.170 "copy": true, 00:15:36.170 "nvme_iov_md": false 00:15:36.170 }, 00:15:36.170 "memory_domains": [ 00:15:36.170 { 00:15:36.170 "dma_device_id": "system", 00:15:36.170 "dma_device_type": 1 00:15:36.170 }, 00:15:36.170 { 00:15:36.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.170 "dma_device_type": 2 00:15:36.170 } 00:15:36.170 ], 00:15:36.170 "driver_specific": {} 00:15:36.170 } 00:15:36.170 ] 00:15:36.170 18:52:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:36.170 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:36.170 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:36.170 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.429 "name": "Existed_Raid", 00:15:36.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.429 "strip_size_kb": 64, 00:15:36.429 "state": "configuring", 00:15:36.429 "raid_level": "concat", 00:15:36.429 "superblock": false, 00:15:36.429 "num_base_bdevs": 4, 00:15:36.429 "num_base_bdevs_discovered": 3, 00:15:36.429 "num_base_bdevs_operational": 4, 00:15:36.429 "base_bdevs_list": [ 00:15:36.429 { 00:15:36.429 "name": "BaseBdev1", 00:15:36.429 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:36.429 "is_configured": true, 00:15:36.429 "data_offset": 0, 00:15:36.429 "data_size": 
65536 00:15:36.429 }, 00:15:36.429 { 00:15:36.429 "name": "BaseBdev2", 00:15:36.429 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:36.429 "is_configured": true, 00:15:36.429 "data_offset": 0, 00:15:36.429 "data_size": 65536 00:15:36.429 }, 00:15:36.429 { 00:15:36.429 "name": "BaseBdev3", 00:15:36.429 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:36.429 "is_configured": true, 00:15:36.429 "data_offset": 0, 00:15:36.429 "data_size": 65536 00:15:36.429 }, 00:15:36.429 { 00:15:36.429 "name": "BaseBdev4", 00:15:36.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.429 "is_configured": false, 00:15:36.429 "data_offset": 0, 00:15:36.429 "data_size": 0 00:15:36.429 } 00:15:36.429 ] 00:15:36.429 }' 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.429 18:52:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.996 18:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:37.254 [2024-07-24 18:52:22.006130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:37.254 [2024-07-24 18:52:22.006160] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd2e490 00:15:37.254 [2024-07-24 18:52:22.006164] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:37.254 [2024-07-24 18:52:22.006295] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd1a2d0 00:15:37.254 [2024-07-24 18:52:22.006385] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd2e490 00:15:37.254 [2024-07-24 18:52:22.006390] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd2e490 00:15:37.254 [2024-07-24 18:52:22.006534] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.254 BaseBdev4 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:37.254 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.255 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:37.512 [ 00:15:37.512 { 00:15:37.512 "name": "BaseBdev4", 00:15:37.512 "aliases": [ 00:15:37.512 "b5529f29-f907-4c04-9df3-3f4d37fb0a17" 00:15:37.512 ], 00:15:37.512 "product_name": "Malloc disk", 00:15:37.512 "block_size": 512, 00:15:37.512 "num_blocks": 65536, 00:15:37.512 "uuid": "b5529f29-f907-4c04-9df3-3f4d37fb0a17", 00:15:37.512 "assigned_rate_limits": { 00:15:37.512 "rw_ios_per_sec": 0, 
00:15:37.512 "rw_mbytes_per_sec": 0, 00:15:37.512 "r_mbytes_per_sec": 0, 00:15:37.512 "w_mbytes_per_sec": 0 00:15:37.512 }, 00:15:37.512 "claimed": true, 00:15:37.512 "claim_type": "exclusive_write", 00:15:37.512 "zoned": false, 00:15:37.512 "supported_io_types": { 00:15:37.512 "read": true, 00:15:37.512 "write": true, 00:15:37.512 "unmap": true, 00:15:37.512 "flush": true, 00:15:37.512 "reset": true, 00:15:37.512 "nvme_admin": false, 00:15:37.512 "nvme_io": false, 00:15:37.512 "nvme_io_md": false, 00:15:37.512 "write_zeroes": true, 00:15:37.512 "zcopy": true, 00:15:37.512 "get_zone_info": false, 00:15:37.512 "zone_management": false, 00:15:37.512 "zone_append": false, 00:15:37.512 "compare": false, 00:15:37.512 "compare_and_write": false, 00:15:37.512 "abort": true, 00:15:37.512 "seek_hole": false, 00:15:37.512 "seek_data": false, 00:15:37.512 "copy": true, 00:15:37.512 "nvme_iov_md": false 00:15:37.512 }, 00:15:37.513 "memory_domains": [ 00:15:37.513 { 00:15:37.513 "dma_device_id": "system", 00:15:37.513 "dma_device_type": 1 00:15:37.513 }, 00:15:37.513 { 00:15:37.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.513 "dma_device_type": 2 00:15:37.513 } 00:15:37.513 ], 00:15:37.513 "driver_specific": {} 00:15:37.513 } 00:15:37.513 ] 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.513 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.772 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.772 "name": "Existed_Raid", 00:15:37.772 "uuid": "a58bfe3d-a9ee-4601-92ba-e9c1d731ef83", 00:15:37.772 "strip_size_kb": 64, 00:15:37.772 "state": "online", 00:15:37.772 "raid_level": "concat", 00:15:37.772 "superblock": false, 00:15:37.772 "num_base_bdevs": 4, 00:15:37.772 "num_base_bdevs_discovered": 4, 00:15:37.772 "num_base_bdevs_operational": 4, 00:15:37.772 "base_bdevs_list": [ 
00:15:37.772 { 00:15:37.772 "name": "BaseBdev1", 00:15:37.772 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:37.772 "is_configured": true, 00:15:37.772 "data_offset": 0, 00:15:37.772 "data_size": 65536 00:15:37.772 }, 00:15:37.772 { 00:15:37.772 "name": "BaseBdev2", 00:15:37.772 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:37.772 "is_configured": true, 00:15:37.772 "data_offset": 0, 00:15:37.772 "data_size": 65536 00:15:37.772 }, 00:15:37.772 { 00:15:37.772 "name": "BaseBdev3", 00:15:37.772 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:37.772 "is_configured": true, 00:15:37.772 "data_offset": 0, 00:15:37.772 "data_size": 65536 00:15:37.772 }, 00:15:37.772 { 00:15:37.772 "name": "BaseBdev4", 00:15:37.772 "uuid": "b5529f29-f907-4c04-9df3-3f4d37fb0a17", 00:15:37.772 "is_configured": true, 00:15:37.772 "data_offset": 0, 00:15:37.772 "data_size": 65536 00:15:37.772 } 00:15:37.772 ] 00:15:37.772 }' 00:15:37.772 18:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.772 18:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.031 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:38.031 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.290 [2024-07-24 18:52:23.197426] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.290 "name": "Existed_Raid", 00:15:38.290 "aliases": [ 00:15:38.290 "a58bfe3d-a9ee-4601-92ba-e9c1d731ef83" 00:15:38.290 ], 00:15:38.290 "product_name": "Raid Volume", 00:15:38.290 "block_size": 512, 00:15:38.290 "num_blocks": 262144, 00:15:38.290 "uuid": "a58bfe3d-a9ee-4601-92ba-e9c1d731ef83", 00:15:38.290 "assigned_rate_limits": { 00:15:38.290 "rw_ios_per_sec": 0, 00:15:38.290 "rw_mbytes_per_sec": 0, 00:15:38.290 "r_mbytes_per_sec": 0, 00:15:38.290 "w_mbytes_per_sec": 0 00:15:38.290 }, 00:15:38.290 "claimed": false, 00:15:38.290 "zoned": false, 00:15:38.290 "supported_io_types": { 00:15:38.290 "read": true, 00:15:38.290 "write": true, 00:15:38.290 "unmap": true, 00:15:38.290 "flush": true, 00:15:38.290 "reset": true, 00:15:38.290 "nvme_admin": false, 00:15:38.290 "nvme_io": false, 00:15:38.290 "nvme_io_md": false, 00:15:38.290 "write_zeroes": true, 00:15:38.290 "zcopy": false, 00:15:38.290 "get_zone_info": false, 00:15:38.290 "zone_management": false, 00:15:38.290 "zone_append": false, 00:15:38.290 "compare": false, 00:15:38.290 "compare_and_write": false, 00:15:38.290 "abort": false, 00:15:38.290 "seek_hole": false, 00:15:38.290 "seek_data": false, 00:15:38.290 "copy": false, 00:15:38.290 
"nvme_iov_md": false 00:15:38.290 }, 00:15:38.290 "memory_domains": [ 00:15:38.290 { 00:15:38.290 "dma_device_id": "system", 00:15:38.290 "dma_device_type": 1 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.290 "dma_device_type": 2 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "system", 00:15:38.290 "dma_device_type": 1 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.290 "dma_device_type": 2 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "system", 00:15:38.290 "dma_device_type": 1 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.290 "dma_device_type": 2 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "system", 00:15:38.290 "dma_device_type": 1 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.290 "dma_device_type": 2 00:15:38.290 } 00:15:38.290 ], 00:15:38.290 "driver_specific": { 00:15:38.290 "raid": { 00:15:38.290 "uuid": "a58bfe3d-a9ee-4601-92ba-e9c1d731ef83", 00:15:38.290 "strip_size_kb": 64, 00:15:38.290 "state": "online", 00:15:38.290 "raid_level": "concat", 00:15:38.290 "superblock": false, 00:15:38.290 "num_base_bdevs": 4, 00:15:38.290 "num_base_bdevs_discovered": 4, 00:15:38.290 "num_base_bdevs_operational": 4, 00:15:38.290 "base_bdevs_list": [ 00:15:38.290 { 00:15:38.290 "name": "BaseBdev1", 00:15:38.290 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:38.290 "is_configured": true, 00:15:38.290 "data_offset": 0, 00:15:38.290 "data_size": 65536 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "name": "BaseBdev2", 00:15:38.290 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:38.290 "is_configured": true, 00:15:38.290 "data_offset": 0, 00:15:38.290 "data_size": 65536 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "name": "BaseBdev3", 00:15:38.290 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:38.290 "is_configured": true, 00:15:38.290 "data_offset": 0, 00:15:38.290 "data_size": 65536 00:15:38.290 }, 00:15:38.290 { 00:15:38.290 "name": "BaseBdev4", 00:15:38.290 "uuid": "b5529f29-f907-4c04-9df3-3f4d37fb0a17", 00:15:38.290 "is_configured": true, 00:15:38.290 "data_offset": 0, 00:15:38.290 "data_size": 65536 00:15:38.290 } 00:15:38.290 ] 00:15:38.290 } 00:15:38.290 } 00:15:38.290 }' 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:38.290 BaseBdev2 00:15:38.290 BaseBdev3 00:15:38.290 BaseBdev4' 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:38.290 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.554 "name": "BaseBdev1", 00:15:38.554 "aliases": [ 00:15:38.554 "eafd503a-2366-4512-b69c-d5c34961b277" 00:15:38.554 ], 00:15:38.554 "product_name": "Malloc disk", 00:15:38.554 "block_size": 512, 00:15:38.554 "num_blocks": 65536, 00:15:38.554 "uuid": "eafd503a-2366-4512-b69c-d5c34961b277", 00:15:38.554 
"assigned_rate_limits": { 00:15:38.554 "rw_ios_per_sec": 0, 00:15:38.554 "rw_mbytes_per_sec": 0, 00:15:38.554 "r_mbytes_per_sec": 0, 00:15:38.554 "w_mbytes_per_sec": 0 00:15:38.554 }, 00:15:38.554 "claimed": true, 00:15:38.554 "claim_type": "exclusive_write", 00:15:38.554 "zoned": false, 00:15:38.554 "supported_io_types": { 00:15:38.554 "read": true, 00:15:38.554 "write": true, 00:15:38.554 "unmap": true, 00:15:38.554 "flush": true, 00:15:38.554 "reset": true, 00:15:38.554 "nvme_admin": false, 00:15:38.554 "nvme_io": false, 00:15:38.554 "nvme_io_md": false, 00:15:38.554 "write_zeroes": true, 00:15:38.554 "zcopy": true, 00:15:38.554 "get_zone_info": false, 00:15:38.554 "zone_management": false, 00:15:38.554 "zone_append": false, 00:15:38.554 "compare": false, 00:15:38.554 "compare_and_write": false, 00:15:38.554 "abort": true, 00:15:38.554 "seek_hole": false, 00:15:38.554 "seek_data": false, 00:15:38.554 "copy": true, 00:15:38.554 "nvme_iov_md": false 00:15:38.554 }, 00:15:38.554 "memory_domains": [ 00:15:38.554 { 00:15:38.554 "dma_device_id": "system", 00:15:38.554 "dma_device_type": 1 00:15:38.554 }, 00:15:38.554 { 00:15:38.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.554 "dma_device_type": 2 00:15:38.554 } 00:15:38.554 ], 00:15:38.554 "driver_specific": {} 00:15:38.554 }' 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.554 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:38.813 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.072 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.072 "name": "BaseBdev2", 00:15:39.072 "aliases": [ 00:15:39.072 "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1" 00:15:39.072 ], 00:15:39.072 "product_name": "Malloc disk", 00:15:39.072 "block_size": 512, 00:15:39.072 "num_blocks": 65536, 00:15:39.072 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:39.072 "assigned_rate_limits": { 00:15:39.072 "rw_ios_per_sec": 0, 00:15:39.072 "rw_mbytes_per_sec": 0, 00:15:39.072 "r_mbytes_per_sec": 0, 00:15:39.072 "w_mbytes_per_sec": 0 00:15:39.072 
}, 00:15:39.072 "claimed": true, 00:15:39.072 "claim_type": "exclusive_write", 00:15:39.072 "zoned": false, 00:15:39.072 "supported_io_types": { 00:15:39.072 "read": true, 00:15:39.072 "write": true, 00:15:39.072 "unmap": true, 00:15:39.072 "flush": true, 00:15:39.072 "reset": true, 00:15:39.072 "nvme_admin": false, 00:15:39.072 "nvme_io": false, 00:15:39.072 "nvme_io_md": false, 00:15:39.072 "write_zeroes": true, 00:15:39.072 "zcopy": true, 00:15:39.072 "get_zone_info": false, 00:15:39.072 "zone_management": false, 00:15:39.072 "zone_append": false, 00:15:39.072 "compare": false, 00:15:39.072 "compare_and_write": false, 00:15:39.072 "abort": true, 00:15:39.072 "seek_hole": false, 00:15:39.072 "seek_data": false, 00:15:39.072 "copy": true, 00:15:39.072 "nvme_iov_md": false 00:15:39.072 }, 00:15:39.072 "memory_domains": [ 00:15:39.072 { 00:15:39.072 "dma_device_id": "system", 00:15:39.072 "dma_device_type": 1 00:15:39.072 }, 00:15:39.072 { 00:15:39.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.072 "dma_device_type": 2 00:15:39.072 } 00:15:39.072 ], 00:15:39.072 "driver_specific": {} 00:15:39.072 }' 00:15:39.072 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.072 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.072 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.072 18:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.072 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.072 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.072 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.072 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:39.330 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.588 "name": "BaseBdev3", 00:15:39.588 "aliases": [ 00:15:39.588 "3d9c27ae-3ee0-47b2-857f-bb237073685b" 00:15:39.588 ], 00:15:39.588 "product_name": "Malloc disk", 00:15:39.588 "block_size": 512, 00:15:39.588 "num_blocks": 65536, 00:15:39.588 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:39.588 "assigned_rate_limits": { 00:15:39.588 "rw_ios_per_sec": 0, 00:15:39.588 "rw_mbytes_per_sec": 0, 00:15:39.588 "r_mbytes_per_sec": 0, 00:15:39.588 "w_mbytes_per_sec": 0 00:15:39.588 }, 00:15:39.588 "claimed": true, 00:15:39.588 "claim_type": "exclusive_write", 00:15:39.588 "zoned": false, 00:15:39.588 "supported_io_types": { 00:15:39.588 "read": true, 
00:15:39.588 "write": true, 00:15:39.588 "unmap": true, 00:15:39.588 "flush": true, 00:15:39.588 "reset": true, 00:15:39.588 "nvme_admin": false, 00:15:39.588 "nvme_io": false, 00:15:39.588 "nvme_io_md": false, 00:15:39.588 "write_zeroes": true, 00:15:39.588 "zcopy": true, 00:15:39.588 "get_zone_info": false, 00:15:39.588 "zone_management": false, 00:15:39.588 "zone_append": false, 00:15:39.588 "compare": false, 00:15:39.588 "compare_and_write": false, 00:15:39.588 "abort": true, 00:15:39.588 "seek_hole": false, 00:15:39.588 "seek_data": false, 00:15:39.588 "copy": true, 00:15:39.588 "nvme_iov_md": false 00:15:39.588 }, 00:15:39.588 "memory_domains": [ 00:15:39.588 { 00:15:39.588 "dma_device_id": "system", 00:15:39.588 "dma_device_type": 1 00:15:39.588 }, 00:15:39.588 { 00:15:39.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.588 "dma_device_type": 2 00:15:39.588 } 00:15:39.588 ], 00:15:39.588 "driver_specific": {} 00:15:39.588 }' 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.588 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.847 "name": "BaseBdev4", 00:15:39.847 "aliases": [ 00:15:39.847 "b5529f29-f907-4c04-9df3-3f4d37fb0a17" 00:15:39.847 ], 00:15:39.847 "product_name": "Malloc disk", 00:15:39.847 "block_size": 512, 00:15:39.847 "num_blocks": 65536, 00:15:39.847 "uuid": "b5529f29-f907-4c04-9df3-3f4d37fb0a17", 00:15:39.847 "assigned_rate_limits": { 00:15:39.847 "rw_ios_per_sec": 0, 00:15:39.847 "rw_mbytes_per_sec": 0, 00:15:39.847 "r_mbytes_per_sec": 0, 00:15:39.847 "w_mbytes_per_sec": 0 00:15:39.847 }, 00:15:39.847 "claimed": true, 00:15:39.847 "claim_type": "exclusive_write", 00:15:39.847 "zoned": false, 00:15:39.847 "supported_io_types": { 00:15:39.847 "read": true, 00:15:39.847 "write": true, 00:15:39.847 "unmap": true, 00:15:39.847 "flush": true, 00:15:39.847 "reset": true, 00:15:39.847 "nvme_admin": false, 00:15:39.847 "nvme_io": false, 
00:15:39.847 "nvme_io_md": false, 00:15:39.847 "write_zeroes": true, 00:15:39.847 "zcopy": true, 00:15:39.847 "get_zone_info": false, 00:15:39.847 "zone_management": false, 00:15:39.847 "zone_append": false, 00:15:39.847 "compare": false, 00:15:39.847 "compare_and_write": false, 00:15:39.847 "abort": true, 00:15:39.847 "seek_hole": false, 00:15:39.847 "seek_data": false, 00:15:39.847 "copy": true, 00:15:39.847 "nvme_iov_md": false 00:15:39.847 }, 00:15:39.847 "memory_domains": [ 00:15:39.847 { 00:15:39.847 "dma_device_id": "system", 00:15:39.847 "dma_device_type": 1 00:15:39.847 }, 00:15:39.847 { 00:15:39.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.847 "dma_device_type": 2 00:15:39.847 } 00:15:39.847 ], 00:15:39.847 "driver_specific": {} 00:15:39.847 }' 00:15:39.847 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.106 18:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.106 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.106 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.106 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.106 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.365 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.365 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:40.365 [2024-07-24 18:52:25.298915] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:40.365 [2024-07-24 18:52:25.298935] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.365 [2024-07-24 18:52:25.298970] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.365 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:40.365 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:40.365 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=concat 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.366 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.625 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.625 "name": "Existed_Raid", 00:15:40.625 "uuid": "a58bfe3d-a9ee-4601-92ba-e9c1d731ef83", 00:15:40.625 "strip_size_kb": 64, 00:15:40.625 "state": "offline", 00:15:40.625 "raid_level": "concat", 00:15:40.625 "superblock": false, 00:15:40.625 "num_base_bdevs": 4, 00:15:40.625 "num_base_bdevs_discovered": 3, 00:15:40.625 "num_base_bdevs_operational": 3, 00:15:40.625 "base_bdevs_list": [ 00:15:40.625 { 00:15:40.625 "name": null, 00:15:40.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:40.625 "is_configured": false, 00:15:40.625 "data_offset": 0, 00:15:40.625 "data_size": 65536 00:15:40.625 }, 00:15:40.625 { 00:15:40.625 "name": "BaseBdev2", 00:15:40.625 "uuid": "2787dc22-e29f-42fc-8ba2-4a1f6c8527e1", 00:15:40.625 "is_configured": true, 00:15:40.625 "data_offset": 0, 00:15:40.625 "data_size": 65536 00:15:40.625 }, 00:15:40.625 { 00:15:40.625 "name": "BaseBdev3", 00:15:40.625 "uuid": "3d9c27ae-3ee0-47b2-857f-bb237073685b", 00:15:40.625 "is_configured": true, 00:15:40.625 "data_offset": 0, 00:15:40.625 "data_size": 65536 00:15:40.625 }, 00:15:40.625 { 00:15:40.625 "name": "BaseBdev4", 00:15:40.625 "uuid": "b5529f29-f907-4c04-9df3-3f4d37fb0a17", 00:15:40.625 "is_configured": true, 00:15:40.625 "data_offset": 0, 00:15:40.625 "data_size": 65536 00:15:40.625 } 00:15:40.625 ] 00:15:40.625 }' 00:15:40.625 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.625 18:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.192 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:41.192 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.192 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.192 18:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:41.193 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:41.193 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:41.193 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:41.451 [2024-07-24 18:52:26.238225] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:41.451 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:41.710 [2024-07-24 18:52:26.580854] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:41.710 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:41.710 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.710 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.710 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:41.980 [2024-07-24 18:52:26.919526] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:41.980 [2024-07-24 18:52:26.919556] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd2e490 name Existed_Raid, state offline 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.980 18:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:42.251 18:52:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:42.251 BaseBdev2 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:42.510 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:42.768 [ 00:15:42.768 { 00:15:42.768 "name": "BaseBdev2", 00:15:42.768 "aliases": [ 00:15:42.768 "876ac4be-7ac7-497a-b7c0-fd78057a8f3e" 00:15:42.768 ], 00:15:42.768 "product_name": "Malloc disk", 00:15:42.768 "block_size": 512, 00:15:42.768 "num_blocks": 65536, 00:15:42.768 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:42.768 "assigned_rate_limits": { 00:15:42.768 "rw_ios_per_sec": 0, 00:15:42.768 "rw_mbytes_per_sec": 0, 00:15:42.768 "r_mbytes_per_sec": 0, 00:15:42.768 "w_mbytes_per_sec": 0 00:15:42.768 }, 00:15:42.768 "claimed": false, 00:15:42.768 "zoned": false, 00:15:42.768 "supported_io_types": { 00:15:42.768 "read": true, 00:15:42.768 "write": true, 00:15:42.768 "unmap": true, 00:15:42.768 "flush": true, 00:15:42.768 "reset": true, 00:15:42.768 "nvme_admin": false, 00:15:42.768 "nvme_io": false, 00:15:42.768 "nvme_io_md": false, 00:15:42.768 "write_zeroes": true, 00:15:42.768 "zcopy": true, 00:15:42.768 "get_zone_info": false, 00:15:42.768 "zone_management": false, 00:15:42.768 "zone_append": false, 00:15:42.768 "compare": false, 00:15:42.768 "compare_and_write": false, 00:15:42.768 "abort": true, 00:15:42.768 "seek_hole": false, 00:15:42.768 "seek_data": false, 00:15:42.768 "copy": true, 00:15:42.768 "nvme_iov_md": false 00:15:42.768 }, 00:15:42.768 "memory_domains": [ 00:15:42.768 { 00:15:42.768 "dma_device_id": "system", 00:15:42.768 "dma_device_type": 1 00:15:42.768 }, 00:15:42.768 { 00:15:42.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.768 "dma_device_type": 2 00:15:42.768 } 00:15:42.768 ], 00:15:42.768 "driver_specific": {} 00:15:42.768 } 00:15:42.768 ] 00:15:42.768 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:42.768 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:42.768 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:42.768 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:42.768 BaseBdev3 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:43.026 18:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:43.285 [ 00:15:43.285 { 00:15:43.285 "name": "BaseBdev3", 00:15:43.285 "aliases": [ 00:15:43.285 "eb2dbf15-89e4-4651-b338-8b3054bc2599" 00:15:43.285 ], 00:15:43.285 "product_name": "Malloc disk", 00:15:43.285 "block_size": 512, 00:15:43.285 "num_blocks": 65536, 00:15:43.285 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:43.285 "assigned_rate_limits": { 00:15:43.285 "rw_ios_per_sec": 0, 00:15:43.285 "rw_mbytes_per_sec": 0, 00:15:43.285 "r_mbytes_per_sec": 0, 00:15:43.285 "w_mbytes_per_sec": 0 00:15:43.285 }, 00:15:43.285 "claimed": false, 00:15:43.285 "zoned": false, 00:15:43.285 "supported_io_types": { 00:15:43.285 "read": true, 00:15:43.285 "write": true, 00:15:43.285 "unmap": true, 00:15:43.285 "flush": true, 00:15:43.285 "reset": true, 00:15:43.285 "nvme_admin": false, 00:15:43.285 "nvme_io": false, 00:15:43.285 "nvme_io_md": false, 00:15:43.285 "write_zeroes": true, 00:15:43.285 "zcopy": true, 00:15:43.285 "get_zone_info": false, 00:15:43.285 "zone_management": false, 00:15:43.285 "zone_append": false, 00:15:43.285 "compare": false, 00:15:43.285 "compare_and_write": false, 00:15:43.285 "abort": true, 00:15:43.285 "seek_hole": false, 00:15:43.285 "seek_data": false, 00:15:43.285 "copy": true, 00:15:43.285 "nvme_iov_md": false 00:15:43.285 }, 00:15:43.285 "memory_domains": [ 00:15:43.285 { 00:15:43.285 "dma_device_id": "system", 00:15:43.285 "dma_device_type": 1 00:15:43.285 }, 00:15:43.285 { 00:15:43.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.285 "dma_device_type": 2 00:15:43.285 } 00:15:43.285 ], 00:15:43.285 "driver_specific": {} 00:15:43.285 } 00:15:43.285 ] 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:43.285 BaseBdev4 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.285 18:52:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.285 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:43.543 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:43.802 [ 00:15:43.802 { 00:15:43.802 "name": "BaseBdev4", 00:15:43.802 "aliases": [ 00:15:43.802 "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd" 00:15:43.802 ], 00:15:43.802 "product_name": "Malloc disk", 00:15:43.802 "block_size": 512, 00:15:43.802 "num_blocks": 65536, 00:15:43.802 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:43.802 "assigned_rate_limits": { 00:15:43.802 "rw_ios_per_sec": 0, 00:15:43.802 "rw_mbytes_per_sec": 0, 00:15:43.802 "r_mbytes_per_sec": 0, 00:15:43.802 "w_mbytes_per_sec": 0 00:15:43.802 }, 00:15:43.802 "claimed": false, 00:15:43.802 "zoned": false, 00:15:43.802 "supported_io_types": { 00:15:43.802 "read": true, 00:15:43.802 "write": true, 00:15:43.802 "unmap": true, 00:15:43.802 "flush": true, 00:15:43.802 "reset": true, 00:15:43.802 "nvme_admin": false, 00:15:43.802 "nvme_io": false, 00:15:43.802 "nvme_io_md": false, 00:15:43.802 "write_zeroes": true, 00:15:43.802 "zcopy": true, 00:15:43.802 "get_zone_info": false, 00:15:43.802 "zone_management": false, 00:15:43.802 "zone_append": false, 00:15:43.802 "compare": false, 00:15:43.802 "compare_and_write": false, 00:15:43.802 "abort": true, 00:15:43.802 "seek_hole": false, 00:15:43.802 "seek_data": false, 00:15:43.802 "copy": true, 00:15:43.802 "nvme_iov_md": false 00:15:43.802 }, 00:15:43.802 "memory_domains": [ 00:15:43.802 { 00:15:43.802 "dma_device_id": "system", 00:15:43.802 "dma_device_type": 1 00:15:43.802 }, 00:15:43.802 { 00:15:43.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.802 "dma_device_type": 2 00:15:43.802 } 00:15:43.802 ], 00:15:43.802 "driver_specific": {} 00:15:43.802 } 00:15:43.802 ] 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:43.802 [2024-07-24 18:52:28.753174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:43.802 [2024-07-24 18:52:28.753207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:43.802 [2024-07-24 18:52:28.753219] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:43.802 [2024-07-24 18:52:28.754183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:43.802 [2024-07-24 18:52:28.754213] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.802 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.061 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.061 "name": "Existed_Raid", 00:15:44.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.061 "strip_size_kb": 64, 00:15:44.061 "state": "configuring", 00:15:44.061 "raid_level": "concat", 00:15:44.061 "superblock": false, 00:15:44.061 "num_base_bdevs": 4, 00:15:44.061 "num_base_bdevs_discovered": 3, 00:15:44.061 "num_base_bdevs_operational": 4, 00:15:44.061 "base_bdevs_list": [ 00:15:44.061 { 00:15:44.061 "name": "BaseBdev1", 00:15:44.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.061 "is_configured": false, 00:15:44.061 "data_offset": 0, 00:15:44.061 "data_size": 0 00:15:44.061 }, 00:15:44.061 { 00:15:44.061 "name": "BaseBdev2", 00:15:44.061 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:44.061 "is_configured": true, 00:15:44.061 "data_offset": 0, 00:15:44.061 "data_size": 65536 00:15:44.061 }, 00:15:44.061 { 00:15:44.061 "name": "BaseBdev3", 00:15:44.061 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:44.061 "is_configured": true, 00:15:44.061 "data_offset": 0, 00:15:44.061 "data_size": 65536 00:15:44.061 }, 00:15:44.061 { 00:15:44.061 "name": "BaseBdev4", 00:15:44.061 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:44.061 "is_configured": true, 00:15:44.061 "data_offset": 0, 00:15:44.061 "data_size": 65536 00:15:44.061 } 00:15:44.061 ] 00:15:44.061 }' 00:15:44.061 18:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.061 18:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:44.628 [2024-07-24 18:52:29.543202] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.628 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.887 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.887 "name": "Existed_Raid", 00:15:44.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.887 "strip_size_kb": 64, 00:15:44.887 "state": "configuring", 00:15:44.887 "raid_level": "concat", 00:15:44.887 "superblock": false, 00:15:44.887 "num_base_bdevs": 4, 00:15:44.887 "num_base_bdevs_discovered": 2, 00:15:44.887 "num_base_bdevs_operational": 4, 00:15:44.887 "base_bdevs_list": [ 00:15:44.887 { 00:15:44.887 "name": "BaseBdev1", 00:15:44.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.887 "is_configured": false, 00:15:44.887 "data_offset": 0, 00:15:44.887 "data_size": 0 00:15:44.887 }, 00:15:44.887 { 00:15:44.887 "name": null, 00:15:44.887 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:44.887 "is_configured": false, 00:15:44.887 "data_offset": 0, 00:15:44.887 "data_size": 65536 00:15:44.887 }, 00:15:44.887 { 00:15:44.887 "name": "BaseBdev3", 00:15:44.887 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:44.887 "is_configured": true, 00:15:44.887 "data_offset": 0, 00:15:44.887 "data_size": 65536 00:15:44.887 }, 00:15:44.887 { 00:15:44.887 "name": "BaseBdev4", 00:15:44.887 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:44.887 "is_configured": true, 00:15:44.887 "data_offset": 0, 00:15:44.887 "data_size": 65536 00:15:44.887 } 00:15:44.887 ] 00:15:44.887 }' 00:15:44.887 18:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.887 18:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.453 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:45.453 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.454 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:45.454 18:52:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:45.711 [2024-07-24 18:52:30.552607] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.711 BaseBdev1 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:45.711 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:45.970 [ 00:15:45.970 { 00:15:45.970 "name": "BaseBdev1", 00:15:45.970 "aliases": [ 00:15:45.970 "d162e681-b44c-4468-8290-41d7357e406b" 00:15:45.970 ], 00:15:45.970 "product_name": "Malloc disk", 00:15:45.970 "block_size": 512, 00:15:45.970 "num_blocks": 65536, 00:15:45.970 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:45.970 "assigned_rate_limits": { 00:15:45.970 "rw_ios_per_sec": 0, 00:15:45.970 "rw_mbytes_per_sec": 0, 00:15:45.970 "r_mbytes_per_sec": 0, 00:15:45.970 "w_mbytes_per_sec": 0 00:15:45.970 }, 00:15:45.970 "claimed": true, 00:15:45.970 "claim_type": "exclusive_write", 00:15:45.970 "zoned": false, 00:15:45.970 "supported_io_types": { 00:15:45.970 "read": true, 00:15:45.970 "write": true, 00:15:45.970 "unmap": true, 00:15:45.970 "flush": true, 00:15:45.970 "reset": true, 00:15:45.970 "nvme_admin": false, 00:15:45.970 "nvme_io": false, 00:15:45.970 "nvme_io_md": false, 00:15:45.970 "write_zeroes": true, 00:15:45.970 "zcopy": true, 00:15:45.970 "get_zone_info": false, 00:15:45.970 "zone_management": false, 00:15:45.970 "zone_append": false, 00:15:45.970 "compare": false, 00:15:45.970 "compare_and_write": false, 00:15:45.970 "abort": true, 00:15:45.970 "seek_hole": false, 00:15:45.970 "seek_data": false, 00:15:45.970 "copy": true, 00:15:45.970 "nvme_iov_md": false 00:15:45.970 }, 00:15:45.970 "memory_domains": [ 00:15:45.970 { 00:15:45.970 "dma_device_id": "system", 00:15:45.970 "dma_device_type": 1 00:15:45.970 }, 00:15:45.970 { 00:15:45.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.970 "dma_device_type": 2 00:15:45.970 } 00:15:45.970 ], 00:15:45.970 "driver_specific": {} 00:15:45.970 } 00:15:45.970 ] 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.970 18:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.228 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.228 "name": "Existed_Raid", 00:15:46.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.228 "strip_size_kb": 64, 00:15:46.228 "state": "configuring", 00:15:46.228 "raid_level": "concat", 00:15:46.228 "superblock": false, 00:15:46.228 "num_base_bdevs": 4, 00:15:46.228 "num_base_bdevs_discovered": 3, 00:15:46.228 "num_base_bdevs_operational": 4, 00:15:46.228 "base_bdevs_list": [ 00:15:46.228 { 00:15:46.228 "name": "BaseBdev1", 00:15:46.228 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:46.228 "is_configured": true, 00:15:46.228 "data_offset": 0, 00:15:46.228 "data_size": 65536 00:15:46.228 }, 00:15:46.228 { 00:15:46.228 "name": null, 00:15:46.228 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:46.228 "is_configured": false, 00:15:46.228 "data_offset": 0, 00:15:46.228 "data_size": 65536 00:15:46.228 }, 00:15:46.228 { 00:15:46.228 "name": "BaseBdev3", 00:15:46.228 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:46.229 "is_configured": true, 00:15:46.229 "data_offset": 0, 00:15:46.229 "data_size": 65536 00:15:46.229 }, 00:15:46.229 { 00:15:46.229 "name": "BaseBdev4", 00:15:46.229 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:46.229 "is_configured": true, 00:15:46.229 "data_offset": 0, 00:15:46.229 "data_size": 65536 00:15:46.229 } 00:15:46.229 ] 00:15:46.229 }' 00:15:46.229 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.229 18:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.794 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:46.795 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.795 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:46.795 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:47.053 [2024-07-24 18:52:31.880058] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:47.053 18:52:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:47.053 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.053 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.053 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:47.053 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.054 18:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.312 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.312 "name": "Existed_Raid", 00:15:47.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.312 "strip_size_kb": 64, 00:15:47.312 "state": "configuring", 00:15:47.312 "raid_level": "concat", 00:15:47.312 "superblock": false, 00:15:47.312 "num_base_bdevs": 4, 00:15:47.312 "num_base_bdevs_discovered": 2, 00:15:47.312 "num_base_bdevs_operational": 4, 00:15:47.312 "base_bdevs_list": [ 00:15:47.312 { 00:15:47.312 "name": "BaseBdev1", 00:15:47.312 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:47.312 "is_configured": true, 00:15:47.312 "data_offset": 0, 00:15:47.312 "data_size": 65536 00:15:47.312 }, 00:15:47.312 { 00:15:47.312 "name": null, 00:15:47.312 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:47.312 "is_configured": false, 00:15:47.312 "data_offset": 0, 00:15:47.312 "data_size": 65536 00:15:47.312 }, 00:15:47.312 { 00:15:47.312 "name": null, 00:15:47.312 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:47.312 "is_configured": false, 00:15:47.312 "data_offset": 0, 00:15:47.312 "data_size": 65536 00:15:47.312 }, 00:15:47.312 { 00:15:47.312 "name": "BaseBdev4", 00:15:47.312 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:47.312 "is_configured": true, 00:15:47.312 "data_offset": 0, 00:15:47.312 "data_size": 65536 00:15:47.312 } 00:15:47.312 ] 00:15:47.313 }' 00:15:47.313 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.313 18:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.571 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.571 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:47.829 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:47.829 
18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:48.087 [2024-07-24 18:52:32.882671] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.087 18:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.087 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.087 "name": "Existed_Raid", 00:15:48.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.087 "strip_size_kb": 64, 00:15:48.087 "state": "configuring", 00:15:48.087 "raid_level": "concat", 00:15:48.087 "superblock": false, 00:15:48.087 "num_base_bdevs": 4, 00:15:48.087 "num_base_bdevs_discovered": 3, 00:15:48.087 "num_base_bdevs_operational": 4, 00:15:48.087 "base_bdevs_list": [ 00:15:48.087 { 00:15:48.087 "name": "BaseBdev1", 00:15:48.087 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:48.087 "is_configured": true, 00:15:48.087 "data_offset": 0, 00:15:48.087 "data_size": 65536 00:15:48.087 }, 00:15:48.087 { 00:15:48.087 "name": null, 00:15:48.087 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:48.087 "is_configured": false, 00:15:48.087 "data_offset": 0, 00:15:48.087 "data_size": 65536 00:15:48.087 }, 00:15:48.087 { 00:15:48.087 "name": "BaseBdev3", 00:15:48.087 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:48.087 "is_configured": true, 00:15:48.087 "data_offset": 0, 00:15:48.087 "data_size": 65536 00:15:48.087 }, 00:15:48.087 { 00:15:48.087 "name": "BaseBdev4", 00:15:48.087 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:48.087 "is_configured": true, 00:15:48.087 "data_offset": 0, 00:15:48.087 "data_size": 65536 00:15:48.087 } 00:15:48.087 ] 00:15:48.087 }' 00:15:48.087 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.087 18:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.651 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.651 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:48.908 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:48.908 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:48.908 [2024-07-24 18:52:33.873232] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:48.908 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.909 18:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.167 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.167 "name": "Existed_Raid", 00:15:49.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.167 "strip_size_kb": 64, 00:15:49.167 "state": "configuring", 00:15:49.167 "raid_level": "concat", 00:15:49.167 "superblock": false, 00:15:49.167 "num_base_bdevs": 4, 00:15:49.167 "num_base_bdevs_discovered": 2, 00:15:49.167 "num_base_bdevs_operational": 4, 00:15:49.167 "base_bdevs_list": [ 00:15:49.167 { 00:15:49.167 "name": null, 00:15:49.167 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:49.167 "is_configured": false, 00:15:49.167 "data_offset": 0, 00:15:49.167 "data_size": 65536 00:15:49.167 }, 00:15:49.167 { 00:15:49.167 "name": null, 00:15:49.167 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:49.167 "is_configured": false, 00:15:49.167 "data_offset": 0, 00:15:49.167 "data_size": 65536 00:15:49.167 }, 00:15:49.167 { 00:15:49.167 "name": "BaseBdev3", 00:15:49.167 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:49.167 "is_configured": true, 00:15:49.167 "data_offset": 0, 00:15:49.167 "data_size": 65536 00:15:49.167 }, 00:15:49.167 { 00:15:49.167 "name": "BaseBdev4", 00:15:49.167 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:49.167 "is_configured": true, 00:15:49.167 "data_offset": 0, 00:15:49.167 "data_size": 65536 00:15:49.167 } 
00:15:49.167 ] 00:15:49.167 }' 00:15:49.167 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.167 18:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.734 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:49.734 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.734 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:49.734 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:49.992 [2024-07-24 18:52:34.861705] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:49.992 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.993 18:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.252 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.252 "name": "Existed_Raid", 00:15:50.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.252 "strip_size_kb": 64, 00:15:50.252 "state": "configuring", 00:15:50.252 "raid_level": "concat", 00:15:50.252 "superblock": false, 00:15:50.252 "num_base_bdevs": 4, 00:15:50.252 "num_base_bdevs_discovered": 3, 00:15:50.252 "num_base_bdevs_operational": 4, 00:15:50.252 "base_bdevs_list": [ 00:15:50.252 { 00:15:50.252 "name": null, 00:15:50.252 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:50.252 "is_configured": false, 00:15:50.252 "data_offset": 0, 00:15:50.252 "data_size": 65536 00:15:50.252 }, 00:15:50.252 { 00:15:50.252 "name": "BaseBdev2", 00:15:50.252 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:50.252 "is_configured": true, 00:15:50.252 "data_offset": 0, 00:15:50.252 "data_size": 65536 00:15:50.252 }, 00:15:50.252 { 00:15:50.252 "name": "BaseBdev3", 00:15:50.252 "uuid": 
"eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:50.252 "is_configured": true, 00:15:50.252 "data_offset": 0, 00:15:50.252 "data_size": 65536 00:15:50.252 }, 00:15:50.252 { 00:15:50.252 "name": "BaseBdev4", 00:15:50.252 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:50.252 "is_configured": true, 00:15:50.252 "data_offset": 0, 00:15:50.252 "data_size": 65536 00:15:50.252 } 00:15:50.252 ] 00:15:50.252 }' 00:15:50.252 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.252 18:52:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.818 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.818 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:50.818 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:50.818 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.818 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:51.076 18:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d162e681-b44c-4468-8290-41d7357e406b 00:15:51.076 [2024-07-24 18:52:36.063405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:51.076 [2024-07-24 18:52:36.063435] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd26880 00:15:51.076 [2024-07-24 18:52:36.063439] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:51.076 [2024-07-24 18:52:36.063572] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd2db70 00:15:51.076 [2024-07-24 18:52:36.063654] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd26880 00:15:51.076 [2024-07-24 18:52:36.063659] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd26880 00:15:51.076 [2024-07-24 18:52:36.063793] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.076 NewBaseBdev 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:51.076 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.335 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:51.594 [ 00:15:51.594 { 00:15:51.594 "name": "NewBaseBdev", 00:15:51.594 "aliases": [ 00:15:51.594 "d162e681-b44c-4468-8290-41d7357e406b" 00:15:51.594 ], 00:15:51.594 "product_name": "Malloc disk", 00:15:51.594 "block_size": 512, 00:15:51.594 "num_blocks": 65536, 00:15:51.594 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:51.594 "assigned_rate_limits": { 00:15:51.594 "rw_ios_per_sec": 0, 00:15:51.594 "rw_mbytes_per_sec": 0, 00:15:51.594 "r_mbytes_per_sec": 0, 00:15:51.594 "w_mbytes_per_sec": 0 00:15:51.594 }, 00:15:51.594 "claimed": true, 00:15:51.594 "claim_type": "exclusive_write", 00:15:51.594 "zoned": false, 00:15:51.594 "supported_io_types": { 00:15:51.594 "read": true, 00:15:51.594 "write": true, 00:15:51.594 "unmap": true, 00:15:51.594 "flush": true, 00:15:51.594 "reset": true, 00:15:51.594 "nvme_admin": false, 00:15:51.594 "nvme_io": false, 00:15:51.594 "nvme_io_md": false, 00:15:51.594 "write_zeroes": true, 00:15:51.594 "zcopy": true, 00:15:51.594 "get_zone_info": false, 00:15:51.594 "zone_management": false, 00:15:51.594 "zone_append": false, 00:15:51.594 "compare": false, 00:15:51.594 "compare_and_write": false, 00:15:51.594 "abort": true, 00:15:51.594 "seek_hole": false, 00:15:51.594 "seek_data": false, 00:15:51.594 "copy": true, 00:15:51.594 "nvme_iov_md": false 00:15:51.594 }, 00:15:51.594 "memory_domains": [ 00:15:51.594 { 00:15:51.594 "dma_device_id": "system", 00:15:51.594 "dma_device_type": 1 00:15:51.594 }, 00:15:51.594 { 00:15:51.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.594 "dma_device_type": 2 00:15:51.594 } 00:15:51.594 ], 00:15:51.594 "driver_specific": {} 00:15:51.594 } 00:15:51.594 ] 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.594 "name": "Existed_Raid", 00:15:51.594 "uuid": 
"e2612525-4ba3-4655-af9e-51d3063578ec", 00:15:51.594 "strip_size_kb": 64, 00:15:51.594 "state": "online", 00:15:51.594 "raid_level": "concat", 00:15:51.594 "superblock": false, 00:15:51.594 "num_base_bdevs": 4, 00:15:51.594 "num_base_bdevs_discovered": 4, 00:15:51.594 "num_base_bdevs_operational": 4, 00:15:51.594 "base_bdevs_list": [ 00:15:51.594 { 00:15:51.594 "name": "NewBaseBdev", 00:15:51.594 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:51.594 "is_configured": true, 00:15:51.594 "data_offset": 0, 00:15:51.594 "data_size": 65536 00:15:51.594 }, 00:15:51.594 { 00:15:51.594 "name": "BaseBdev2", 00:15:51.594 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:51.594 "is_configured": true, 00:15:51.594 "data_offset": 0, 00:15:51.594 "data_size": 65536 00:15:51.594 }, 00:15:51.594 { 00:15:51.594 "name": "BaseBdev3", 00:15:51.594 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:51.594 "is_configured": true, 00:15:51.594 "data_offset": 0, 00:15:51.594 "data_size": 65536 00:15:51.594 }, 00:15:51.594 { 00:15:51.594 "name": "BaseBdev4", 00:15:51.594 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:51.594 "is_configured": true, 00:15:51.594 "data_offset": 0, 00:15:51.594 "data_size": 65536 00:15:51.594 } 00:15:51.594 ] 00:15:51.594 }' 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.594 18:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:52.161 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:52.419 [2024-07-24 18:52:37.194562] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:52.419 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:52.419 "name": "Existed_Raid", 00:15:52.419 "aliases": [ 00:15:52.419 "e2612525-4ba3-4655-af9e-51d3063578ec" 00:15:52.419 ], 00:15:52.419 "product_name": "Raid Volume", 00:15:52.419 "block_size": 512, 00:15:52.419 "num_blocks": 262144, 00:15:52.419 "uuid": "e2612525-4ba3-4655-af9e-51d3063578ec", 00:15:52.420 "assigned_rate_limits": { 00:15:52.420 "rw_ios_per_sec": 0, 00:15:52.420 "rw_mbytes_per_sec": 0, 00:15:52.420 "r_mbytes_per_sec": 0, 00:15:52.420 "w_mbytes_per_sec": 0 00:15:52.420 }, 00:15:52.420 "claimed": false, 00:15:52.420 "zoned": false, 00:15:52.420 "supported_io_types": { 00:15:52.420 "read": true, 00:15:52.420 "write": true, 00:15:52.420 "unmap": true, 00:15:52.420 "flush": true, 00:15:52.420 "reset": true, 00:15:52.420 "nvme_admin": false, 00:15:52.420 "nvme_io": false, 00:15:52.420 "nvme_io_md": false, 00:15:52.420 "write_zeroes": true, 
00:15:52.420 "zcopy": false, 00:15:52.420 "get_zone_info": false, 00:15:52.420 "zone_management": false, 00:15:52.420 "zone_append": false, 00:15:52.420 "compare": false, 00:15:52.420 "compare_and_write": false, 00:15:52.420 "abort": false, 00:15:52.420 "seek_hole": false, 00:15:52.420 "seek_data": false, 00:15:52.420 "copy": false, 00:15:52.420 "nvme_iov_md": false 00:15:52.420 }, 00:15:52.420 "memory_domains": [ 00:15:52.420 { 00:15:52.420 "dma_device_id": "system", 00:15:52.420 "dma_device_type": 1 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.420 "dma_device_type": 2 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "system", 00:15:52.420 "dma_device_type": 1 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.420 "dma_device_type": 2 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "system", 00:15:52.420 "dma_device_type": 1 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.420 "dma_device_type": 2 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "system", 00:15:52.420 "dma_device_type": 1 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.420 "dma_device_type": 2 00:15:52.420 } 00:15:52.420 ], 00:15:52.420 "driver_specific": { 00:15:52.420 "raid": { 00:15:52.420 "uuid": "e2612525-4ba3-4655-af9e-51d3063578ec", 00:15:52.420 "strip_size_kb": 64, 00:15:52.420 "state": "online", 00:15:52.420 "raid_level": "concat", 00:15:52.420 "superblock": false, 00:15:52.420 "num_base_bdevs": 4, 00:15:52.420 "num_base_bdevs_discovered": 4, 00:15:52.420 "num_base_bdevs_operational": 4, 00:15:52.420 "base_bdevs_list": [ 00:15:52.420 { 00:15:52.420 "name": "NewBaseBdev", 00:15:52.420 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:52.420 "is_configured": true, 00:15:52.420 "data_offset": 0, 00:15:52.420 "data_size": 65536 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "name": "BaseBdev2", 00:15:52.420 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:52.420 "is_configured": true, 00:15:52.420 "data_offset": 0, 00:15:52.420 "data_size": 65536 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "name": "BaseBdev3", 00:15:52.420 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 00:15:52.420 "is_configured": true, 00:15:52.420 "data_offset": 0, 00:15:52.420 "data_size": 65536 00:15:52.420 }, 00:15:52.420 { 00:15:52.420 "name": "BaseBdev4", 00:15:52.420 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:52.420 "is_configured": true, 00:15:52.420 "data_offset": 0, 00:15:52.420 "data_size": 65536 00:15:52.420 } 00:15:52.420 ] 00:15:52.420 } 00:15:52.420 } 00:15:52.420 }' 00:15:52.420 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:52.420 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:52.420 BaseBdev2 00:15:52.420 BaseBdev3 00:15:52.420 BaseBdev4' 00:15:52.420 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.420 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:52.420 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.679 "name": "NewBaseBdev", 00:15:52.679 "aliases": [ 00:15:52.679 "d162e681-b44c-4468-8290-41d7357e406b" 00:15:52.679 ], 00:15:52.679 "product_name": "Malloc disk", 00:15:52.679 "block_size": 512, 00:15:52.679 "num_blocks": 65536, 00:15:52.679 "uuid": "d162e681-b44c-4468-8290-41d7357e406b", 00:15:52.679 "assigned_rate_limits": { 00:15:52.679 "rw_ios_per_sec": 0, 00:15:52.679 "rw_mbytes_per_sec": 0, 00:15:52.679 "r_mbytes_per_sec": 0, 00:15:52.679 "w_mbytes_per_sec": 0 00:15:52.679 }, 00:15:52.679 "claimed": true, 00:15:52.679 "claim_type": "exclusive_write", 00:15:52.679 "zoned": false, 00:15:52.679 "supported_io_types": { 00:15:52.679 "read": true, 00:15:52.679 "write": true, 00:15:52.679 "unmap": true, 00:15:52.679 "flush": true, 00:15:52.679 "reset": true, 00:15:52.679 "nvme_admin": false, 00:15:52.679 "nvme_io": false, 00:15:52.679 "nvme_io_md": false, 00:15:52.679 "write_zeroes": true, 00:15:52.679 "zcopy": true, 00:15:52.679 "get_zone_info": false, 00:15:52.679 "zone_management": false, 00:15:52.679 "zone_append": false, 00:15:52.679 "compare": false, 00:15:52.679 "compare_and_write": false, 00:15:52.679 "abort": true, 00:15:52.679 "seek_hole": false, 00:15:52.679 "seek_data": false, 00:15:52.679 "copy": true, 00:15:52.679 "nvme_iov_md": false 00:15:52.679 }, 00:15:52.679 "memory_domains": [ 00:15:52.679 { 00:15:52.679 "dma_device_id": "system", 00:15:52.679 "dma_device_type": 1 00:15:52.679 }, 00:15:52.679 { 00:15:52.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.679 "dma_device_type": 2 00:15:52.679 } 00:15:52.679 ], 00:15:52.679 "driver_specific": {} 00:15:52.679 }' 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.679 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.938 "name": "BaseBdev2", 00:15:52.938 "aliases": [ 00:15:52.938 "876ac4be-7ac7-497a-b7c0-fd78057a8f3e" 00:15:52.938 ], 
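The property pass (verify_raid_bdev_properties) first pulls the configured member names out of driver_specific.raid with a jq select, then re-queries each member and compares block_size, md_size, md_interleave and dif_type against the raid volume's own values, which is exactly the repeated jq/[[ ]] pattern in this part of the trace. A compact sketch of that loop, under the same socket-path assumption and not the bdev_raid.sh code itself:

  #!/usr/bin/env bash
  # Sketch: every configured base bdev must match the raid volume's geometry.
  set -e
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  raid_info=$($RPC bdev_get_bdevs -b Existed_Raid | jq '.[]')
  names=$(echo "$raid_info" | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

  for name in $names; do
      base_info=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
      for field in .block_size .md_size .md_interleave .dif_type; do
          raid_val=$(echo "$raid_info" | jq "$field")
          base_val=$(echo "$base_info" | jq "$field")
          [[ "$raid_val" == "$base_val" ]] || { echo "$name: $field mismatch ($base_val vs $raid_val)"; exit 1; }
      done
      echo "$name matches Existed_Raid"
  done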
00:15:52.938 "product_name": "Malloc disk", 00:15:52.938 "block_size": 512, 00:15:52.938 "num_blocks": 65536, 00:15:52.938 "uuid": "876ac4be-7ac7-497a-b7c0-fd78057a8f3e", 00:15:52.938 "assigned_rate_limits": { 00:15:52.938 "rw_ios_per_sec": 0, 00:15:52.938 "rw_mbytes_per_sec": 0, 00:15:52.938 "r_mbytes_per_sec": 0, 00:15:52.938 "w_mbytes_per_sec": 0 00:15:52.938 }, 00:15:52.938 "claimed": true, 00:15:52.938 "claim_type": "exclusive_write", 00:15:52.938 "zoned": false, 00:15:52.938 "supported_io_types": { 00:15:52.938 "read": true, 00:15:52.938 "write": true, 00:15:52.938 "unmap": true, 00:15:52.938 "flush": true, 00:15:52.938 "reset": true, 00:15:52.938 "nvme_admin": false, 00:15:52.938 "nvme_io": false, 00:15:52.938 "nvme_io_md": false, 00:15:52.938 "write_zeroes": true, 00:15:52.938 "zcopy": true, 00:15:52.938 "get_zone_info": false, 00:15:52.938 "zone_management": false, 00:15:52.938 "zone_append": false, 00:15:52.938 "compare": false, 00:15:52.938 "compare_and_write": false, 00:15:52.938 "abort": true, 00:15:52.938 "seek_hole": false, 00:15:52.938 "seek_data": false, 00:15:52.938 "copy": true, 00:15:52.938 "nvme_iov_md": false 00:15:52.938 }, 00:15:52.938 "memory_domains": [ 00:15:52.938 { 00:15:52.938 "dma_device_id": "system", 00:15:52.938 "dma_device_type": 1 00:15:52.938 }, 00:15:52.938 { 00:15:52.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.938 "dma_device_type": 2 00:15:52.938 } 00:15:52.938 ], 00:15:52.938 "driver_specific": {} 00:15:52.938 }' 00:15:52.938 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.196 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.196 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.196 18:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.196 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.196 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.196 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.196 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.196 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.197 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.197 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.456 "name": "BaseBdev3", 00:15:53.456 "aliases": [ 00:15:53.456 "eb2dbf15-89e4-4651-b338-8b3054bc2599" 00:15:53.456 ], 00:15:53.456 "product_name": "Malloc disk", 00:15:53.456 "block_size": 512, 00:15:53.456 "num_blocks": 65536, 00:15:53.456 "uuid": "eb2dbf15-89e4-4651-b338-8b3054bc2599", 
00:15:53.456 "assigned_rate_limits": { 00:15:53.456 "rw_ios_per_sec": 0, 00:15:53.456 "rw_mbytes_per_sec": 0, 00:15:53.456 "r_mbytes_per_sec": 0, 00:15:53.456 "w_mbytes_per_sec": 0 00:15:53.456 }, 00:15:53.456 "claimed": true, 00:15:53.456 "claim_type": "exclusive_write", 00:15:53.456 "zoned": false, 00:15:53.456 "supported_io_types": { 00:15:53.456 "read": true, 00:15:53.456 "write": true, 00:15:53.456 "unmap": true, 00:15:53.456 "flush": true, 00:15:53.456 "reset": true, 00:15:53.456 "nvme_admin": false, 00:15:53.456 "nvme_io": false, 00:15:53.456 "nvme_io_md": false, 00:15:53.456 "write_zeroes": true, 00:15:53.456 "zcopy": true, 00:15:53.456 "get_zone_info": false, 00:15:53.456 "zone_management": false, 00:15:53.456 "zone_append": false, 00:15:53.456 "compare": false, 00:15:53.456 "compare_and_write": false, 00:15:53.456 "abort": true, 00:15:53.456 "seek_hole": false, 00:15:53.456 "seek_data": false, 00:15:53.456 "copy": true, 00:15:53.456 "nvme_iov_md": false 00:15:53.456 }, 00:15:53.456 "memory_domains": [ 00:15:53.456 { 00:15:53.456 "dma_device_id": "system", 00:15:53.456 "dma_device_type": 1 00:15:53.456 }, 00:15:53.456 { 00:15:53.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.456 "dma_device_type": 2 00:15:53.456 } 00:15:53.456 ], 00:15:53.456 "driver_specific": {} 00:15:53.456 }' 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.456 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:53.715 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.974 "name": "BaseBdev4", 00:15:53.974 "aliases": [ 00:15:53.974 "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd" 00:15:53.974 ], 00:15:53.974 "product_name": "Malloc disk", 00:15:53.974 "block_size": 512, 00:15:53.974 "num_blocks": 65536, 00:15:53.974 "uuid": "f3cafb78-6bc8-4932-9a9f-40d4c39c78fd", 00:15:53.974 "assigned_rate_limits": { 00:15:53.974 "rw_ios_per_sec": 0, 00:15:53.974 "rw_mbytes_per_sec": 0, 00:15:53.974 "r_mbytes_per_sec": 0, 00:15:53.974 "w_mbytes_per_sec": 0 
00:15:53.974 }, 00:15:53.974 "claimed": true, 00:15:53.974 "claim_type": "exclusive_write", 00:15:53.974 "zoned": false, 00:15:53.974 "supported_io_types": { 00:15:53.974 "read": true, 00:15:53.974 "write": true, 00:15:53.974 "unmap": true, 00:15:53.974 "flush": true, 00:15:53.974 "reset": true, 00:15:53.974 "nvme_admin": false, 00:15:53.974 "nvme_io": false, 00:15:53.974 "nvme_io_md": false, 00:15:53.974 "write_zeroes": true, 00:15:53.974 "zcopy": true, 00:15:53.974 "get_zone_info": false, 00:15:53.974 "zone_management": false, 00:15:53.974 "zone_append": false, 00:15:53.974 "compare": false, 00:15:53.974 "compare_and_write": false, 00:15:53.974 "abort": true, 00:15:53.974 "seek_hole": false, 00:15:53.974 "seek_data": false, 00:15:53.974 "copy": true, 00:15:53.974 "nvme_iov_md": false 00:15:53.974 }, 00:15:53.974 "memory_domains": [ 00:15:53.974 { 00:15:53.974 "dma_device_id": "system", 00:15:53.974 "dma_device_type": 1 00:15:53.974 }, 00:15:53.974 { 00:15:53.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.974 "dma_device_type": 2 00:15:53.974 } 00:15:53.974 ], 00:15:53.974 "driver_specific": {} 00:15:53.974 }' 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.974 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.232 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.232 18:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.232 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:54.491 [2024-07-24 18:52:39.291774] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:54.491 [2024-07-24 18:52:39.291794] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:54.491 [2024-07-24 18:52:39.291838] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:54.491 [2024-07-24 18:52:39.291880] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:54.491 [2024-07-24 18:52:39.291886] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd26880 name Existed_Raid, state offline 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2120142 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2120142 ']' 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@952 -- # kill -0 2120142 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2120142 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2120142' 00:15:54.491 killing process with pid 2120142 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2120142 00:15:54.491 [2024-07-24 18:52:39.339293] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:54.491 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2120142 00:15:54.491 [2024-07-24 18:52:39.370436] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:54.750 00:15:54.750 real 0m24.169s 00:15:54.750 user 0m45.107s 00:15:54.750 sys 0m3.695s 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.750 ************************************ 00:15:54.750 END TEST raid_state_function_test 00:15:54.750 ************************************ 00:15:54.750 18:52:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:15:54.750 18:52:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:54.750 18:52:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:54.750 18:52:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:54.750 ************************************ 00:15:54.750 START TEST raid_state_function_test_sb 00:15:54.750 ************************************ 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo 
BaseBdev2 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2124732 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2124732' 00:15:54.750 Process raid pid: 2124732 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2124732 /var/tmp/spdk-raid.sock 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2124732 ']' 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:54.750 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
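Each *_sb case starts its own bdev_svc instance on a private RPC socket with the bdev_raid debug log flag enabled, then blocks in waitforlisten until that socket answers before issuing any RPCs. The loop below is a simplified stand-in for the autotest_common.sh helper; using rpc_get_methods as the readiness probe is an assumption here, chosen only because it is a lightweight RPC available on every SPDK target.

  #!/usr/bin/env bash
  # Sketch: launch the SPDK bdev service and wait for its RPC socket to answer.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock

  "$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
  raid_pid=$!

  # Poll until the target responds (the real helper also checks that the pid is
  # still alive and gives up after a timeout).
  until "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods > /dev/null 2>&1; do
      sleep 0.1
  done
  echo "bdev_svc (pid $raid_pid) is listening on $SOCK"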
00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:54.750 18:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.750 [2024-07-24 18:52:39.667176] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:15:54.750 [2024-07-24 18:52:39.667216] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:54.750 [2024-07-24 18:52:39.732436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:55.009 [2024-07-24 18:52:39.812387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:55.009 [2024-07-24 18:52:39.873777] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.009 [2024-07-24 18:52:39.873801] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.576 18:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:55.576 18:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:55.576 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:55.835 [2024-07-24 18:52:40.612947] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:55.835 [2024-07-24 18:52:40.612978] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:55.835 [2024-07-24 18:52:40.612984] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:55.835 [2024-07-24 18:52:40.612989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:55.835 [2024-07-24 18:52:40.612993] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:55.835 [2024-07-24 18:52:40.612998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:55.835 [2024-07-24 18:52:40.613003] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:55.835 [2024-07-24 18:52:40.613008] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.835 "name": "Existed_Raid", 00:15:55.835 "uuid": "fdb2af8b-e3b6-4107-986a-bb06f973398a", 00:15:55.835 "strip_size_kb": 64, 00:15:55.835 "state": "configuring", 00:15:55.835 "raid_level": "concat", 00:15:55.835 "superblock": true, 00:15:55.835 "num_base_bdevs": 4, 00:15:55.835 "num_base_bdevs_discovered": 0, 00:15:55.835 "num_base_bdevs_operational": 4, 00:15:55.835 "base_bdevs_list": [ 00:15:55.835 { 00:15:55.835 "name": "BaseBdev1", 00:15:55.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.835 "is_configured": false, 00:15:55.835 "data_offset": 0, 00:15:55.835 "data_size": 0 00:15:55.835 }, 00:15:55.835 { 00:15:55.835 "name": "BaseBdev2", 00:15:55.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.835 "is_configured": false, 00:15:55.835 "data_offset": 0, 00:15:55.835 "data_size": 0 00:15:55.835 }, 00:15:55.835 { 00:15:55.835 "name": "BaseBdev3", 00:15:55.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.835 "is_configured": false, 00:15:55.835 "data_offset": 0, 00:15:55.835 "data_size": 0 00:15:55.835 }, 00:15:55.835 { 00:15:55.835 "name": "BaseBdev4", 00:15:55.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.835 "is_configured": false, 00:15:55.835 "data_offset": 0, 00:15:55.835 "data_size": 0 00:15:55.835 } 00:15:55.835 ] 00:15:55.835 }' 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.835 18:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.400 18:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:56.658 [2024-07-24 18:52:41.447004] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:56.658 [2024-07-24 18:52:41.447029] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3ebc0 name Existed_Raid, state configuring 00:15:56.658 18:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:56.658 [2024-07-24 18:52:41.611451] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:56.658 [2024-07-24 18:52:41.611479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:56.658 [2024-07-24 18:52:41.611485] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:56.658 [2024-07-24 18:52:41.611490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:56.658 [2024-07-24 18:52:41.611494] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:56.658 [2024-07-24 18:52:41.611499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:56.658 [2024-07-24 18:52:41.611503] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:56.658 [2024-07-24 18:52:41.611508] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:56.658 18:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:56.915 [2024-07-24 18:52:41.784315] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:56.915 BaseBdev1 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:56.915 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.172 18:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:57.172 [ 00:15:57.172 { 00:15:57.172 "name": "BaseBdev1", 00:15:57.172 "aliases": [ 00:15:57.172 "746bc0c4-2b30-4bac-9ab7-f66de605bd6f" 00:15:57.172 ], 00:15:57.172 "product_name": "Malloc disk", 00:15:57.172 "block_size": 512, 00:15:57.172 "num_blocks": 65536, 00:15:57.172 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:15:57.172 "assigned_rate_limits": { 00:15:57.172 "rw_ios_per_sec": 0, 00:15:57.172 "rw_mbytes_per_sec": 0, 00:15:57.172 "r_mbytes_per_sec": 0, 00:15:57.172 "w_mbytes_per_sec": 0 00:15:57.172 }, 00:15:57.172 "claimed": true, 00:15:57.172 "claim_type": "exclusive_write", 00:15:57.172 "zoned": false, 00:15:57.172 "supported_io_types": { 00:15:57.172 "read": true, 00:15:57.172 "write": true, 00:15:57.172 "unmap": true, 00:15:57.172 "flush": true, 00:15:57.172 "reset": true, 00:15:57.172 "nvme_admin": false, 00:15:57.172 "nvme_io": false, 00:15:57.172 "nvme_io_md": false, 00:15:57.172 "write_zeroes": true, 00:15:57.172 "zcopy": true, 00:15:57.172 "get_zone_info": false, 00:15:57.172 "zone_management": false, 00:15:57.172 "zone_append": false, 00:15:57.172 "compare": false, 00:15:57.172 "compare_and_write": false, 00:15:57.172 "abort": true, 00:15:57.172 "seek_hole": false, 00:15:57.172 "seek_data": false, 00:15:57.172 "copy": true, 00:15:57.172 "nvme_iov_md": false 00:15:57.172 }, 00:15:57.172 "memory_domains": [ 00:15:57.172 { 00:15:57.172 "dma_device_id": "system", 00:15:57.172 "dma_device_type": 1 00:15:57.172 }, 00:15:57.172 { 00:15:57.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.172 "dma_device_type": 2 00:15:57.172 } 
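In the superblock variant the only change to the create call is the -s flag carried by superblock_create_arg: the array can still be declared before any member exists, sits in "configuring", and claims each BaseBdevN as the matching malloc bdev is created. A condensed sketch of that incremental bring-up follows; the real test also interleaves delete/re-create cycles that are omitted here.

  #!/usr/bin/env bash
  # Sketch: declare a 4-member concat array with on-disk superblocks, then grow
  # it one base bdev at a time and watch num_base_bdevs_discovered climb to 4.
  set -e
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $RPC bdev_raid_create -z 64 -s -r concat \
       -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

  for i in 1 2 3 4; do
      $RPC bdev_malloc_create 32 512 -b "BaseBdev$i"
      $RPC bdev_wait_for_examine
      $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | "\(.num_base_bdevs_discovered)/\(.num_base_bdevs) discovered, state=\(.state)"'
  done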
00:15:57.172 ], 00:15:57.172 "driver_specific": {} 00:15:57.172 } 00:15:57.172 ] 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.172 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.430 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.430 "name": "Existed_Raid", 00:15:57.430 "uuid": "b73cf42b-6868-4832-a5c0-5ad9b86bc2a1", 00:15:57.430 "strip_size_kb": 64, 00:15:57.430 "state": "configuring", 00:15:57.430 "raid_level": "concat", 00:15:57.430 "superblock": true, 00:15:57.430 "num_base_bdevs": 4, 00:15:57.430 "num_base_bdevs_discovered": 1, 00:15:57.430 "num_base_bdevs_operational": 4, 00:15:57.430 "base_bdevs_list": [ 00:15:57.430 { 00:15:57.430 "name": "BaseBdev1", 00:15:57.430 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:15:57.430 "is_configured": true, 00:15:57.430 "data_offset": 2048, 00:15:57.430 "data_size": 63488 00:15:57.430 }, 00:15:57.430 { 00:15:57.430 "name": "BaseBdev2", 00:15:57.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.430 "is_configured": false, 00:15:57.430 "data_offset": 0, 00:15:57.430 "data_size": 0 00:15:57.430 }, 00:15:57.430 { 00:15:57.430 "name": "BaseBdev3", 00:15:57.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.430 "is_configured": false, 00:15:57.430 "data_offset": 0, 00:15:57.430 "data_size": 0 00:15:57.430 }, 00:15:57.430 { 00:15:57.430 "name": "BaseBdev4", 00:15:57.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.430 "is_configured": false, 00:15:57.430 "data_offset": 0, 00:15:57.430 "data_size": 0 00:15:57.430 } 00:15:57.430 ] 00:15:57.430 }' 00:15:57.430 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.430 18:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:57.996 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:15:57.997 [2024-07-24 18:52:42.951333] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:57.997 [2024-07-24 18:52:42.951365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3e430 name Existed_Raid, state configuring 00:15:57.997 18:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:58.255 [2024-07-24 18:52:43.119809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:58.255 [2024-07-24 18:52:43.120871] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.255 [2024-07-24 18:52:43.120896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.255 [2024-07-24 18:52:43.120901] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.255 [2024-07-24 18:52:43.120906] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.255 [2024-07-24 18:52:43.120910] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:58.255 [2024-07-24 18:52:43.120914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.255 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.513 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.513 "name": "Existed_Raid", 00:15:58.513 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:15:58.513 "strip_size_kb": 64, 00:15:58.513 "state": "configuring", 00:15:58.513 "raid_level": "concat", 00:15:58.513 
"superblock": true, 00:15:58.513 "num_base_bdevs": 4, 00:15:58.513 "num_base_bdevs_discovered": 1, 00:15:58.513 "num_base_bdevs_operational": 4, 00:15:58.513 "base_bdevs_list": [ 00:15:58.513 { 00:15:58.513 "name": "BaseBdev1", 00:15:58.513 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:15:58.513 "is_configured": true, 00:15:58.513 "data_offset": 2048, 00:15:58.513 "data_size": 63488 00:15:58.513 }, 00:15:58.513 { 00:15:58.513 "name": "BaseBdev2", 00:15:58.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.513 "is_configured": false, 00:15:58.513 "data_offset": 0, 00:15:58.513 "data_size": 0 00:15:58.513 }, 00:15:58.513 { 00:15:58.513 "name": "BaseBdev3", 00:15:58.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.513 "is_configured": false, 00:15:58.513 "data_offset": 0, 00:15:58.513 "data_size": 0 00:15:58.513 }, 00:15:58.513 { 00:15:58.513 "name": "BaseBdev4", 00:15:58.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.513 "is_configured": false, 00:15:58.513 "data_offset": 0, 00:15:58.513 "data_size": 0 00:15:58.513 } 00:15:58.513 ] 00:15:58.513 }' 00:15:58.513 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.513 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.810 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:59.069 [2024-07-24 18:52:43.924770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:59.069 BaseBdev2 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:59.069 18:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:59.326 [ 00:15:59.326 { 00:15:59.326 "name": "BaseBdev2", 00:15:59.326 "aliases": [ 00:15:59.326 "186bcf2a-e509-4de8-81cb-2c2e9560191d" 00:15:59.326 ], 00:15:59.326 "product_name": "Malloc disk", 00:15:59.326 "block_size": 512, 00:15:59.326 "num_blocks": 65536, 00:15:59.326 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:15:59.326 "assigned_rate_limits": { 00:15:59.326 "rw_ios_per_sec": 0, 00:15:59.326 "rw_mbytes_per_sec": 0, 00:15:59.326 "r_mbytes_per_sec": 0, 00:15:59.326 "w_mbytes_per_sec": 0 00:15:59.326 }, 00:15:59.326 "claimed": true, 00:15:59.326 "claim_type": "exclusive_write", 00:15:59.326 "zoned": false, 00:15:59.326 "supported_io_types": { 00:15:59.326 "read": true, 00:15:59.326 "write": true, 00:15:59.326 "unmap": true, 
00:15:59.326 "flush": true, 00:15:59.326 "reset": true, 00:15:59.326 "nvme_admin": false, 00:15:59.326 "nvme_io": false, 00:15:59.326 "nvme_io_md": false, 00:15:59.326 "write_zeroes": true, 00:15:59.326 "zcopy": true, 00:15:59.326 "get_zone_info": false, 00:15:59.326 "zone_management": false, 00:15:59.326 "zone_append": false, 00:15:59.326 "compare": false, 00:15:59.326 "compare_and_write": false, 00:15:59.326 "abort": true, 00:15:59.326 "seek_hole": false, 00:15:59.326 "seek_data": false, 00:15:59.326 "copy": true, 00:15:59.326 "nvme_iov_md": false 00:15:59.326 }, 00:15:59.326 "memory_domains": [ 00:15:59.326 { 00:15:59.326 "dma_device_id": "system", 00:15:59.326 "dma_device_type": 1 00:15:59.326 }, 00:15:59.326 { 00:15:59.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.326 "dma_device_type": 2 00:15:59.326 } 00:15:59.326 ], 00:15:59.326 "driver_specific": {} 00:15:59.326 } 00:15:59.326 ] 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.326 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.584 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.584 "name": "Existed_Raid", 00:15:59.584 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:15:59.584 "strip_size_kb": 64, 00:15:59.584 "state": "configuring", 00:15:59.584 "raid_level": "concat", 00:15:59.584 "superblock": true, 00:15:59.584 "num_base_bdevs": 4, 00:15:59.584 "num_base_bdevs_discovered": 2, 00:15:59.584 "num_base_bdevs_operational": 4, 00:15:59.584 "base_bdevs_list": [ 00:15:59.584 { 00:15:59.584 "name": "BaseBdev1", 00:15:59.584 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:15:59.584 "is_configured": true, 00:15:59.584 "data_offset": 2048, 00:15:59.584 "data_size": 63488 00:15:59.584 }, 00:15:59.584 { 00:15:59.584 "name": "BaseBdev2", 
00:15:59.584 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:15:59.584 "is_configured": true, 00:15:59.584 "data_offset": 2048, 00:15:59.584 "data_size": 63488 00:15:59.584 }, 00:15:59.584 { 00:15:59.584 "name": "BaseBdev3", 00:15:59.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.584 "is_configured": false, 00:15:59.584 "data_offset": 0, 00:15:59.584 "data_size": 0 00:15:59.585 }, 00:15:59.585 { 00:15:59.585 "name": "BaseBdev4", 00:15:59.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.585 "is_configured": false, 00:15:59.585 "data_offset": 0, 00:15:59.585 "data_size": 0 00:15:59.585 } 00:15:59.585 ] 00:15:59.585 }' 00:15:59.585 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.585 18:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.151 18:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:00.151 [2024-07-24 18:52:45.098629] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:00.151 BaseBdev3 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:00.151 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.408 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:00.667 [ 00:16:00.667 { 00:16:00.667 "name": "BaseBdev3", 00:16:00.667 "aliases": [ 00:16:00.667 "a8977f53-ff41-45cc-bcc7-16b613555e68" 00:16:00.667 ], 00:16:00.667 "product_name": "Malloc disk", 00:16:00.667 "block_size": 512, 00:16:00.667 "num_blocks": 65536, 00:16:00.667 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:00.667 "assigned_rate_limits": { 00:16:00.667 "rw_ios_per_sec": 0, 00:16:00.667 "rw_mbytes_per_sec": 0, 00:16:00.667 "r_mbytes_per_sec": 0, 00:16:00.667 "w_mbytes_per_sec": 0 00:16:00.667 }, 00:16:00.667 "claimed": true, 00:16:00.667 "claim_type": "exclusive_write", 00:16:00.667 "zoned": false, 00:16:00.667 "supported_io_types": { 00:16:00.667 "read": true, 00:16:00.667 "write": true, 00:16:00.667 "unmap": true, 00:16:00.667 "flush": true, 00:16:00.667 "reset": true, 00:16:00.667 "nvme_admin": false, 00:16:00.667 "nvme_io": false, 00:16:00.667 "nvme_io_md": false, 00:16:00.667 "write_zeroes": true, 00:16:00.667 "zcopy": true, 00:16:00.667 "get_zone_info": false, 00:16:00.667 "zone_management": false, 00:16:00.667 "zone_append": false, 00:16:00.667 "compare": false, 00:16:00.667 "compare_and_write": false, 00:16:00.667 "abort": true, 00:16:00.667 "seek_hole": 
false, 00:16:00.667 "seek_data": false, 00:16:00.667 "copy": true, 00:16:00.667 "nvme_iov_md": false 00:16:00.667 }, 00:16:00.667 "memory_domains": [ 00:16:00.667 { 00:16:00.667 "dma_device_id": "system", 00:16:00.667 "dma_device_type": 1 00:16:00.667 }, 00:16:00.667 { 00:16:00.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.667 "dma_device_type": 2 00:16:00.667 } 00:16:00.667 ], 00:16:00.667 "driver_specific": {} 00:16:00.667 } 00:16:00.667 ] 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.667 "name": "Existed_Raid", 00:16:00.667 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:16:00.667 "strip_size_kb": 64, 00:16:00.667 "state": "configuring", 00:16:00.667 "raid_level": "concat", 00:16:00.667 "superblock": true, 00:16:00.667 "num_base_bdevs": 4, 00:16:00.667 "num_base_bdevs_discovered": 3, 00:16:00.667 "num_base_bdevs_operational": 4, 00:16:00.667 "base_bdevs_list": [ 00:16:00.667 { 00:16:00.667 "name": "BaseBdev1", 00:16:00.667 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:16:00.667 "is_configured": true, 00:16:00.667 "data_offset": 2048, 00:16:00.667 "data_size": 63488 00:16:00.667 }, 00:16:00.667 { 00:16:00.667 "name": "BaseBdev2", 00:16:00.667 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:16:00.667 "is_configured": true, 00:16:00.667 "data_offset": 2048, 00:16:00.667 "data_size": 63488 00:16:00.667 }, 00:16:00.667 { 00:16:00.667 "name": "BaseBdev3", 00:16:00.667 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:00.667 "is_configured": true, 00:16:00.667 "data_offset": 2048, 00:16:00.667 "data_size": 63488 00:16:00.667 }, 00:16:00.667 { 00:16:00.667 "name": "BaseBdev4", 
00:16:00.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.667 "is_configured": false, 00:16:00.667 "data_offset": 0, 00:16:00.667 "data_size": 0 00:16:00.667 } 00:16:00.667 ] 00:16:00.667 }' 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.667 18:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.262 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:01.520 [2024-07-24 18:52:46.312433] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:01.520 [2024-07-24 18:52:46.312582] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd3f490 00:16:01.520 [2024-07-24 18:52:46.312592] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:01.520 [2024-07-24 18:52:46.312709] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd2b2d0 00:16:01.520 [2024-07-24 18:52:46.312795] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd3f490 00:16:01.520 [2024-07-24 18:52:46.312801] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd3f490 00:16:01.520 [2024-07-24 18:52:46.312867] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:01.520 BaseBdev4 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.520 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:01.780 [ 00:16:01.780 { 00:16:01.780 "name": "BaseBdev4", 00:16:01.780 "aliases": [ 00:16:01.780 "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80" 00:16:01.780 ], 00:16:01.780 "product_name": "Malloc disk", 00:16:01.780 "block_size": 512, 00:16:01.780 "num_blocks": 65536, 00:16:01.780 "uuid": "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80", 00:16:01.780 "assigned_rate_limits": { 00:16:01.780 "rw_ios_per_sec": 0, 00:16:01.780 "rw_mbytes_per_sec": 0, 00:16:01.780 "r_mbytes_per_sec": 0, 00:16:01.780 "w_mbytes_per_sec": 0 00:16:01.780 }, 00:16:01.780 "claimed": true, 00:16:01.780 "claim_type": "exclusive_write", 00:16:01.780 "zoned": false, 00:16:01.780 "supported_io_types": { 00:16:01.780 "read": true, 00:16:01.780 "write": true, 00:16:01.780 "unmap": true, 00:16:01.780 "flush": true, 00:16:01.780 "reset": true, 00:16:01.780 "nvme_admin": false, 00:16:01.780 "nvme_io": false, 00:16:01.780 "nvme_io_md": 
false, 00:16:01.780 "write_zeroes": true, 00:16:01.780 "zcopy": true, 00:16:01.780 "get_zone_info": false, 00:16:01.780 "zone_management": false, 00:16:01.780 "zone_append": false, 00:16:01.780 "compare": false, 00:16:01.780 "compare_and_write": false, 00:16:01.780 "abort": true, 00:16:01.780 "seek_hole": false, 00:16:01.780 "seek_data": false, 00:16:01.780 "copy": true, 00:16:01.780 "nvme_iov_md": false 00:16:01.780 }, 00:16:01.780 "memory_domains": [ 00:16:01.780 { 00:16:01.780 "dma_device_id": "system", 00:16:01.780 "dma_device_type": 1 00:16:01.780 }, 00:16:01.780 { 00:16:01.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.780 "dma_device_type": 2 00:16:01.780 } 00:16:01.780 ], 00:16:01.780 "driver_specific": {} 00:16:01.780 } 00:16:01.780 ] 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.780 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.039 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.039 "name": "Existed_Raid", 00:16:02.039 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:16:02.039 "strip_size_kb": 64, 00:16:02.039 "state": "online", 00:16:02.039 "raid_level": "concat", 00:16:02.039 "superblock": true, 00:16:02.039 "num_base_bdevs": 4, 00:16:02.039 "num_base_bdevs_discovered": 4, 00:16:02.039 "num_base_bdevs_operational": 4, 00:16:02.039 "base_bdevs_list": [ 00:16:02.039 { 00:16:02.039 "name": "BaseBdev1", 00:16:02.039 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:16:02.039 "is_configured": true, 00:16:02.039 "data_offset": 2048, 00:16:02.039 "data_size": 63488 00:16:02.039 }, 00:16:02.039 { 00:16:02.039 "name": "BaseBdev2", 00:16:02.039 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:16:02.039 "is_configured": true, 00:16:02.039 "data_offset": 2048, 00:16:02.039 "data_size": 63488 
00:16:02.039 }, 00:16:02.039 { 00:16:02.039 "name": "BaseBdev3", 00:16:02.039 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:02.039 "is_configured": true, 00:16:02.039 "data_offset": 2048, 00:16:02.039 "data_size": 63488 00:16:02.039 }, 00:16:02.039 { 00:16:02.039 "name": "BaseBdev4", 00:16:02.039 "uuid": "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80", 00:16:02.039 "is_configured": true, 00:16:02.039 "data_offset": 2048, 00:16:02.039 "data_size": 63488 00:16:02.039 } 00:16:02.039 ] 00:16:02.039 }' 00:16:02.039 18:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.039 18:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:02.606 [2024-07-24 18:52:47.467651] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:02.606 "name": "Existed_Raid", 00:16:02.606 "aliases": [ 00:16:02.606 "2bf5a7f3-9f01-4ba6-954f-b402380d539b" 00:16:02.606 ], 00:16:02.606 "product_name": "Raid Volume", 00:16:02.606 "block_size": 512, 00:16:02.606 "num_blocks": 253952, 00:16:02.606 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:16:02.606 "assigned_rate_limits": { 00:16:02.606 "rw_ios_per_sec": 0, 00:16:02.606 "rw_mbytes_per_sec": 0, 00:16:02.606 "r_mbytes_per_sec": 0, 00:16:02.606 "w_mbytes_per_sec": 0 00:16:02.606 }, 00:16:02.606 "claimed": false, 00:16:02.606 "zoned": false, 00:16:02.606 "supported_io_types": { 00:16:02.606 "read": true, 00:16:02.606 "write": true, 00:16:02.606 "unmap": true, 00:16:02.606 "flush": true, 00:16:02.606 "reset": true, 00:16:02.606 "nvme_admin": false, 00:16:02.606 "nvme_io": false, 00:16:02.606 "nvme_io_md": false, 00:16:02.606 "write_zeroes": true, 00:16:02.606 "zcopy": false, 00:16:02.606 "get_zone_info": false, 00:16:02.606 "zone_management": false, 00:16:02.606 "zone_append": false, 00:16:02.606 "compare": false, 00:16:02.606 "compare_and_write": false, 00:16:02.606 "abort": false, 00:16:02.606 "seek_hole": false, 00:16:02.606 "seek_data": false, 00:16:02.606 "copy": false, 00:16:02.606 "nvme_iov_md": false 00:16:02.606 }, 00:16:02.606 "memory_domains": [ 00:16:02.606 { 00:16:02.606 "dma_device_id": "system", 00:16:02.606 "dma_device_type": 1 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.606 "dma_device_type": 2 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "system", 00:16:02.606 "dma_device_type": 1 00:16:02.606 
}, 00:16:02.606 { 00:16:02.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.606 "dma_device_type": 2 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "system", 00:16:02.606 "dma_device_type": 1 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.606 "dma_device_type": 2 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "system", 00:16:02.606 "dma_device_type": 1 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.606 "dma_device_type": 2 00:16:02.606 } 00:16:02.606 ], 00:16:02.606 "driver_specific": { 00:16:02.606 "raid": { 00:16:02.606 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:16:02.606 "strip_size_kb": 64, 00:16:02.606 "state": "online", 00:16:02.606 "raid_level": "concat", 00:16:02.606 "superblock": true, 00:16:02.606 "num_base_bdevs": 4, 00:16:02.606 "num_base_bdevs_discovered": 4, 00:16:02.606 "num_base_bdevs_operational": 4, 00:16:02.606 "base_bdevs_list": [ 00:16:02.606 { 00:16:02.606 "name": "BaseBdev1", 00:16:02.606 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:16:02.606 "is_configured": true, 00:16:02.606 "data_offset": 2048, 00:16:02.606 "data_size": 63488 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "name": "BaseBdev2", 00:16:02.606 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:16:02.606 "is_configured": true, 00:16:02.606 "data_offset": 2048, 00:16:02.606 "data_size": 63488 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "name": "BaseBdev3", 00:16:02.606 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:02.606 "is_configured": true, 00:16:02.606 "data_offset": 2048, 00:16:02.606 "data_size": 63488 00:16:02.606 }, 00:16:02.606 { 00:16:02.606 "name": "BaseBdev4", 00:16:02.606 "uuid": "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80", 00:16:02.606 "is_configured": true, 00:16:02.606 "data_offset": 2048, 00:16:02.606 "data_size": 63488 00:16:02.606 } 00:16:02.606 ] 00:16:02.606 } 00:16:02.606 } 00:16:02.606 }' 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:02.606 BaseBdev2 00:16:02.606 BaseBdev3 00:16:02.606 BaseBdev4' 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:02.606 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:02.865 "name": "BaseBdev1", 00:16:02.865 "aliases": [ 00:16:02.865 "746bc0c4-2b30-4bac-9ab7-f66de605bd6f" 00:16:02.865 ], 00:16:02.865 "product_name": "Malloc disk", 00:16:02.865 "block_size": 512, 00:16:02.865 "num_blocks": 65536, 00:16:02.865 "uuid": "746bc0c4-2b30-4bac-9ab7-f66de605bd6f", 00:16:02.865 "assigned_rate_limits": { 00:16:02.865 "rw_ios_per_sec": 0, 00:16:02.865 "rw_mbytes_per_sec": 0, 00:16:02.865 "r_mbytes_per_sec": 0, 00:16:02.865 "w_mbytes_per_sec": 0 00:16:02.865 }, 00:16:02.865 "claimed": true, 00:16:02.865 "claim_type": "exclusive_write", 00:16:02.865 "zoned": false, 00:16:02.865 "supported_io_types": { 00:16:02.865 "read": true, 00:16:02.865 
"write": true, 00:16:02.865 "unmap": true, 00:16:02.865 "flush": true, 00:16:02.865 "reset": true, 00:16:02.865 "nvme_admin": false, 00:16:02.865 "nvme_io": false, 00:16:02.865 "nvme_io_md": false, 00:16:02.865 "write_zeroes": true, 00:16:02.865 "zcopy": true, 00:16:02.865 "get_zone_info": false, 00:16:02.865 "zone_management": false, 00:16:02.865 "zone_append": false, 00:16:02.865 "compare": false, 00:16:02.865 "compare_and_write": false, 00:16:02.865 "abort": true, 00:16:02.865 "seek_hole": false, 00:16:02.865 "seek_data": false, 00:16:02.865 "copy": true, 00:16:02.865 "nvme_iov_md": false 00:16:02.865 }, 00:16:02.865 "memory_domains": [ 00:16:02.865 { 00:16:02.865 "dma_device_id": "system", 00:16:02.865 "dma_device_type": 1 00:16:02.865 }, 00:16:02.865 { 00:16:02.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.865 "dma_device_type": 2 00:16:02.865 } 00:16:02.865 ], 00:16:02.865 "driver_specific": {} 00:16:02.865 }' 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:02.865 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.124 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.124 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:03.124 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:03.124 18:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:03.124 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:03.124 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:03.124 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:03.124 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:03.382 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:03.382 "name": "BaseBdev2", 00:16:03.382 "aliases": [ 00:16:03.382 "186bcf2a-e509-4de8-81cb-2c2e9560191d" 00:16:03.382 ], 00:16:03.382 "product_name": "Malloc disk", 00:16:03.382 "block_size": 512, 00:16:03.382 "num_blocks": 65536, 00:16:03.382 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:16:03.382 "assigned_rate_limits": { 00:16:03.382 "rw_ios_per_sec": 0, 00:16:03.383 "rw_mbytes_per_sec": 0, 00:16:03.383 "r_mbytes_per_sec": 0, 00:16:03.383 "w_mbytes_per_sec": 0 00:16:03.383 }, 00:16:03.383 "claimed": true, 00:16:03.383 "claim_type": "exclusive_write", 00:16:03.383 "zoned": false, 00:16:03.383 "supported_io_types": { 00:16:03.383 "read": true, 00:16:03.383 "write": true, 00:16:03.383 "unmap": true, 00:16:03.383 "flush": true, 00:16:03.383 "reset": true, 00:16:03.383 "nvme_admin": false, 
00:16:03.383 "nvme_io": false, 00:16:03.383 "nvme_io_md": false, 00:16:03.383 "write_zeroes": true, 00:16:03.383 "zcopy": true, 00:16:03.383 "get_zone_info": false, 00:16:03.383 "zone_management": false, 00:16:03.383 "zone_append": false, 00:16:03.383 "compare": false, 00:16:03.383 "compare_and_write": false, 00:16:03.383 "abort": true, 00:16:03.383 "seek_hole": false, 00:16:03.383 "seek_data": false, 00:16:03.383 "copy": true, 00:16:03.383 "nvme_iov_md": false 00:16:03.383 }, 00:16:03.383 "memory_domains": [ 00:16:03.383 { 00:16:03.383 "dma_device_id": "system", 00:16:03.383 "dma_device_type": 1 00:16:03.383 }, 00:16:03.383 { 00:16:03.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.383 "dma_device_type": 2 00:16:03.383 } 00:16:03.383 ], 00:16:03.383 "driver_specific": {} 00:16:03.383 }' 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.383 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:03.642 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:03.900 "name": "BaseBdev3", 00:16:03.900 "aliases": [ 00:16:03.900 "a8977f53-ff41-45cc-bcc7-16b613555e68" 00:16:03.900 ], 00:16:03.900 "product_name": "Malloc disk", 00:16:03.900 "block_size": 512, 00:16:03.900 "num_blocks": 65536, 00:16:03.900 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:03.900 "assigned_rate_limits": { 00:16:03.900 "rw_ios_per_sec": 0, 00:16:03.900 "rw_mbytes_per_sec": 0, 00:16:03.900 "r_mbytes_per_sec": 0, 00:16:03.900 "w_mbytes_per_sec": 0 00:16:03.900 }, 00:16:03.900 "claimed": true, 00:16:03.900 "claim_type": "exclusive_write", 00:16:03.900 "zoned": false, 00:16:03.900 "supported_io_types": { 00:16:03.900 "read": true, 00:16:03.900 "write": true, 00:16:03.900 "unmap": true, 00:16:03.900 "flush": true, 00:16:03.900 "reset": true, 00:16:03.900 "nvme_admin": false, 00:16:03.900 "nvme_io": false, 00:16:03.900 "nvme_io_md": false, 00:16:03.900 "write_zeroes": true, 00:16:03.900 "zcopy": true, 
00:16:03.900 "get_zone_info": false, 00:16:03.900 "zone_management": false, 00:16:03.900 "zone_append": false, 00:16:03.900 "compare": false, 00:16:03.900 "compare_and_write": false, 00:16:03.900 "abort": true, 00:16:03.900 "seek_hole": false, 00:16:03.900 "seek_data": false, 00:16:03.900 "copy": true, 00:16:03.900 "nvme_iov_md": false 00:16:03.900 }, 00:16:03.900 "memory_domains": [ 00:16:03.900 { 00:16:03.900 "dma_device_id": "system", 00:16:03.900 "dma_device_type": 1 00:16:03.900 }, 00:16:03.900 { 00:16:03.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.900 "dma_device_type": 2 00:16:03.900 } 00:16:03.900 ], 00:16:03.900 "driver_specific": {} 00:16:03.900 }' 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:03.900 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:04.158 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:04.158 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:04.158 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:04.158 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:04.158 18:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:04.158 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:04.158 "name": "BaseBdev4", 00:16:04.158 "aliases": [ 00:16:04.158 "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80" 00:16:04.158 ], 00:16:04.158 "product_name": "Malloc disk", 00:16:04.158 "block_size": 512, 00:16:04.158 "num_blocks": 65536, 00:16:04.158 "uuid": "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80", 00:16:04.158 "assigned_rate_limits": { 00:16:04.158 "rw_ios_per_sec": 0, 00:16:04.158 "rw_mbytes_per_sec": 0, 00:16:04.158 "r_mbytes_per_sec": 0, 00:16:04.158 "w_mbytes_per_sec": 0 00:16:04.158 }, 00:16:04.158 "claimed": true, 00:16:04.158 "claim_type": "exclusive_write", 00:16:04.158 "zoned": false, 00:16:04.158 "supported_io_types": { 00:16:04.158 "read": true, 00:16:04.158 "write": true, 00:16:04.158 "unmap": true, 00:16:04.158 "flush": true, 00:16:04.158 "reset": true, 00:16:04.158 "nvme_admin": false, 00:16:04.158 "nvme_io": false, 00:16:04.158 "nvme_io_md": false, 00:16:04.158 "write_zeroes": true, 00:16:04.158 "zcopy": true, 00:16:04.158 "get_zone_info": false, 00:16:04.158 "zone_management": false, 00:16:04.158 "zone_append": false, 00:16:04.158 
"compare": false, 00:16:04.158 "compare_and_write": false, 00:16:04.158 "abort": true, 00:16:04.158 "seek_hole": false, 00:16:04.158 "seek_data": false, 00:16:04.158 "copy": true, 00:16:04.158 "nvme_iov_md": false 00:16:04.158 }, 00:16:04.158 "memory_domains": [ 00:16:04.158 { 00:16:04.158 "dma_device_id": "system", 00:16:04.158 "dma_device_type": 1 00:16:04.158 }, 00:16:04.159 { 00:16:04.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.159 "dma_device_type": 2 00:16:04.159 } 00:16:04.159 ], 00:16:04.159 "driver_specific": {} 00:16:04.159 }' 00:16:04.159 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:04.417 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:04.675 [2024-07-24 18:52:49.617049] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:04.675 [2024-07-24 18:52:49.617069] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:04.675 [2024-07-24 18:52:49.617102] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.675 
18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.675 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.934 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.934 "name": "Existed_Raid", 00:16:04.934 "uuid": "2bf5a7f3-9f01-4ba6-954f-b402380d539b", 00:16:04.934 "strip_size_kb": 64, 00:16:04.934 "state": "offline", 00:16:04.934 "raid_level": "concat", 00:16:04.934 "superblock": true, 00:16:04.934 "num_base_bdevs": 4, 00:16:04.934 "num_base_bdevs_discovered": 3, 00:16:04.934 "num_base_bdevs_operational": 3, 00:16:04.934 "base_bdevs_list": [ 00:16:04.934 { 00:16:04.934 "name": null, 00:16:04.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.934 "is_configured": false, 00:16:04.934 "data_offset": 2048, 00:16:04.934 "data_size": 63488 00:16:04.934 }, 00:16:04.934 { 00:16:04.934 "name": "BaseBdev2", 00:16:04.934 "uuid": "186bcf2a-e509-4de8-81cb-2c2e9560191d", 00:16:04.934 "is_configured": true, 00:16:04.934 "data_offset": 2048, 00:16:04.934 "data_size": 63488 00:16:04.934 }, 00:16:04.934 { 00:16:04.934 "name": "BaseBdev3", 00:16:04.934 "uuid": "a8977f53-ff41-45cc-bcc7-16b613555e68", 00:16:04.934 "is_configured": true, 00:16:04.934 "data_offset": 2048, 00:16:04.934 "data_size": 63488 00:16:04.934 }, 00:16:04.934 { 00:16:04.934 "name": "BaseBdev4", 00:16:04.934 "uuid": "5d8b6e7e-7678-45cd-bf94-4ae1a9661f80", 00:16:04.934 "is_configured": true, 00:16:04.934 "data_offset": 2048, 00:16:04.934 "data_size": 63488 00:16:04.934 } 00:16:04.934 ] 00:16:04.934 }' 00:16:04.935 18:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.935 18:52:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:05.502 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:05.760 [2024-07-24 
18:52:50.596370] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:05.760 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:05.760 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:05.760 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.760 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:06.019 [2024-07-24 18:52:50.947109] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.019 18:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:06.278 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:06.278 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:06.278 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:06.536 [2024-07-24 18:52:51.289677] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:06.536 [2024-07-24 18:52:51.289708] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3f490 name Existed_Raid, state offline 00:16:06.536 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:06.536 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:06.537 18:52:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:06.795 BaseBdev2 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:06.795 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.054 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:07.054 [ 00:16:07.054 { 00:16:07.054 "name": "BaseBdev2", 00:16:07.054 "aliases": [ 00:16:07.054 "7bf6cb96-43b8-4533-a8a4-d0912e49abfe" 00:16:07.054 ], 00:16:07.054 "product_name": "Malloc disk", 00:16:07.054 "block_size": 512, 00:16:07.054 "num_blocks": 65536, 00:16:07.054 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:07.054 "assigned_rate_limits": { 00:16:07.054 "rw_ios_per_sec": 0, 00:16:07.054 "rw_mbytes_per_sec": 0, 00:16:07.054 "r_mbytes_per_sec": 0, 00:16:07.054 "w_mbytes_per_sec": 0 00:16:07.054 }, 00:16:07.054 "claimed": false, 00:16:07.054 "zoned": false, 00:16:07.054 "supported_io_types": { 00:16:07.054 "read": true, 00:16:07.054 "write": true, 00:16:07.054 "unmap": true, 00:16:07.054 "flush": true, 00:16:07.054 "reset": true, 00:16:07.054 "nvme_admin": false, 00:16:07.054 "nvme_io": false, 00:16:07.054 "nvme_io_md": false, 00:16:07.054 "write_zeroes": true, 00:16:07.054 "zcopy": true, 00:16:07.054 "get_zone_info": false, 00:16:07.054 "zone_management": false, 00:16:07.054 "zone_append": false, 00:16:07.054 "compare": false, 00:16:07.054 "compare_and_write": false, 00:16:07.054 "abort": true, 00:16:07.054 "seek_hole": false, 00:16:07.054 "seek_data": false, 00:16:07.054 "copy": true, 00:16:07.054 "nvme_iov_md": false 00:16:07.054 }, 00:16:07.054 "memory_domains": [ 00:16:07.054 { 00:16:07.054 "dma_device_id": "system", 00:16:07.054 "dma_device_type": 1 00:16:07.054 }, 00:16:07.054 { 00:16:07.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.054 "dma_device_type": 2 00:16:07.054 } 00:16:07.054 ], 00:16:07.054 "driver_specific": {} 00:16:07.054 } 00:16:07.054 ] 00:16:07.054 18:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:07.054 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:07.054 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:07.055 18:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:07.313 BaseBdev3 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.313 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:07.571 [ 00:16:07.571 { 00:16:07.571 "name": "BaseBdev3", 00:16:07.571 "aliases": [ 00:16:07.571 "3efec0cb-e390-427e-bce2-b2faa0632a52" 00:16:07.571 ], 00:16:07.571 "product_name": "Malloc disk", 00:16:07.571 "block_size": 512, 00:16:07.572 "num_blocks": 65536, 00:16:07.572 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:07.572 "assigned_rate_limits": { 00:16:07.572 "rw_ios_per_sec": 0, 00:16:07.572 "rw_mbytes_per_sec": 0, 00:16:07.572 "r_mbytes_per_sec": 0, 00:16:07.572 "w_mbytes_per_sec": 0 00:16:07.572 }, 00:16:07.572 "claimed": false, 00:16:07.572 "zoned": false, 00:16:07.572 "supported_io_types": { 00:16:07.572 "read": true, 00:16:07.572 "write": true, 00:16:07.572 "unmap": true, 00:16:07.572 "flush": true, 00:16:07.572 "reset": true, 00:16:07.572 "nvme_admin": false, 00:16:07.572 "nvme_io": false, 00:16:07.572 "nvme_io_md": false, 00:16:07.572 "write_zeroes": true, 00:16:07.572 "zcopy": true, 00:16:07.572 "get_zone_info": false, 00:16:07.572 "zone_management": false, 00:16:07.572 "zone_append": false, 00:16:07.572 "compare": false, 00:16:07.572 "compare_and_write": false, 00:16:07.572 "abort": true, 00:16:07.572 "seek_hole": false, 00:16:07.572 "seek_data": false, 00:16:07.572 "copy": true, 00:16:07.572 "nvme_iov_md": false 00:16:07.572 }, 00:16:07.572 "memory_domains": [ 00:16:07.572 { 00:16:07.572 "dma_device_id": "system", 00:16:07.572 "dma_device_type": 1 00:16:07.572 }, 00:16:07.572 { 00:16:07.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.572 "dma_device_type": 2 00:16:07.572 } 00:16:07.572 ], 00:16:07.572 "driver_specific": {} 00:16:07.572 } 00:16:07.572 ] 00:16:07.572 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:07.572 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:07.572 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:07.572 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:07.831 BaseBdev4 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.831 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:08.090 [ 00:16:08.090 { 00:16:08.090 "name": "BaseBdev4", 00:16:08.090 "aliases": [ 00:16:08.090 "9fdae26f-e462-4bb2-afee-966d37b9556a" 00:16:08.090 ], 00:16:08.090 "product_name": "Malloc disk", 00:16:08.090 "block_size": 512, 00:16:08.090 "num_blocks": 65536, 00:16:08.090 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:08.090 "assigned_rate_limits": { 00:16:08.090 "rw_ios_per_sec": 0, 00:16:08.090 "rw_mbytes_per_sec": 0, 00:16:08.090 "r_mbytes_per_sec": 0, 00:16:08.090 "w_mbytes_per_sec": 0 00:16:08.090 }, 00:16:08.090 "claimed": false, 00:16:08.090 "zoned": false, 00:16:08.090 "supported_io_types": { 00:16:08.090 "read": true, 00:16:08.090 "write": true, 00:16:08.090 "unmap": true, 00:16:08.090 "flush": true, 00:16:08.090 "reset": true, 00:16:08.090 "nvme_admin": false, 00:16:08.090 "nvme_io": false, 00:16:08.090 "nvme_io_md": false, 00:16:08.090 "write_zeroes": true, 00:16:08.090 "zcopy": true, 00:16:08.090 "get_zone_info": false, 00:16:08.090 "zone_management": false, 00:16:08.090 "zone_append": false, 00:16:08.090 "compare": false, 00:16:08.090 "compare_and_write": false, 00:16:08.090 "abort": true, 00:16:08.090 "seek_hole": false, 00:16:08.090 "seek_data": false, 00:16:08.090 "copy": true, 00:16:08.090 "nvme_iov_md": false 00:16:08.090 }, 00:16:08.090 "memory_domains": [ 00:16:08.090 { 00:16:08.090 "dma_device_id": "system", 00:16:08.090 "dma_device_type": 1 00:16:08.090 }, 00:16:08.090 { 00:16:08.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.090 "dma_device_type": 2 00:16:08.090 } 00:16:08.090 ], 00:16:08.090 "driver_specific": {} 00:16:08.090 } 00:16:08.090 ] 00:16:08.090 18:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:08.090 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:08.090 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:08.090 18:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:08.349 [2024-07-24 18:52:53.099251] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:08.349 [2024-07-24 18:52:53.099282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:08.349 [2024-07-24 18:52:53.099293] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:08.349 [2024-07-24 18:52:53.100207] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
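(Editor's note, not part of the captured run: the entries above show the test re-assembling the concat volume with bdev_raid_create while BaseBdev1 is deliberately absent, which is why the NOTICE "Currently unable to find bdev with name: BaseBdev1" appears and the raid bdev stays in the "configuring" state with only three of four base bdevs discovered. A minimal sketch of driving the same RPC sequence by hand against a standalone SPDK target is given below. It reuses only the commands and flags visible in this log — bdev_malloc_create 32 512, bdev_wait_for_examine, bdev_raid_create -z 64 -s -r concat, bdev_raid_get_bdevs all — while the variable names and the assumption that a target is already listening on /var/tmp/spdk-raid.sock are illustrative only.)

# Hypothetical manual reproduction; assumes an SPDK app is running with an RPC
# socket at /var/tmp/spdk-raid.sock, as used by this test.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Create four 32 MiB malloc bdevs with a 512-byte block size, mirroring the
# bdev_malloc_create calls traced above.
for name in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "$name"
done
"$rpc" -s "$sock" bdev_wait_for_examine

# Assemble them into a concat raid bdev with a 64 KiB strip size (-z 64) and an
# on-disk superblock (-s), matching the _sb test variant in this log.
"$rpc" -s "$sock" bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# With all four base bdevs present the state reported here is "online"; in the
# step logged above, BaseBdev1 is missing, so it reads "configuring" instead.
"$rpc" -s "$sock" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid").state'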
00:16:08.349 [2024-07-24 18:52:53.100238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.349 "name": "Existed_Raid", 00:16:08.349 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:08.349 "strip_size_kb": 64, 00:16:08.349 "state": "configuring", 00:16:08.349 "raid_level": "concat", 00:16:08.349 "superblock": true, 00:16:08.349 "num_base_bdevs": 4, 00:16:08.349 "num_base_bdevs_discovered": 3, 00:16:08.349 "num_base_bdevs_operational": 4, 00:16:08.349 "base_bdevs_list": [ 00:16:08.349 { 00:16:08.349 "name": "BaseBdev1", 00:16:08.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.349 "is_configured": false, 00:16:08.349 "data_offset": 0, 00:16:08.349 "data_size": 0 00:16:08.349 }, 00:16:08.349 { 00:16:08.349 "name": "BaseBdev2", 00:16:08.349 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:08.349 "is_configured": true, 00:16:08.349 "data_offset": 2048, 00:16:08.349 "data_size": 63488 00:16:08.349 }, 00:16:08.349 { 00:16:08.349 "name": "BaseBdev3", 00:16:08.349 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:08.349 "is_configured": true, 00:16:08.349 "data_offset": 2048, 00:16:08.349 "data_size": 63488 00:16:08.349 }, 00:16:08.349 { 00:16:08.349 "name": "BaseBdev4", 00:16:08.349 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:08.349 "is_configured": true, 00:16:08.349 "data_offset": 2048, 00:16:08.349 "data_size": 63488 00:16:08.349 } 00:16:08.349 ] 00:16:08.349 }' 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.349 18:52:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.918 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:08.918 [2024-07-24 
18:52:53.909325] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.177 18:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.177 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.177 "name": "Existed_Raid", 00:16:09.177 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:09.177 "strip_size_kb": 64, 00:16:09.177 "state": "configuring", 00:16:09.177 "raid_level": "concat", 00:16:09.177 "superblock": true, 00:16:09.177 "num_base_bdevs": 4, 00:16:09.177 "num_base_bdevs_discovered": 2, 00:16:09.177 "num_base_bdevs_operational": 4, 00:16:09.177 "base_bdevs_list": [ 00:16:09.177 { 00:16:09.177 "name": "BaseBdev1", 00:16:09.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.177 "is_configured": false, 00:16:09.177 "data_offset": 0, 00:16:09.177 "data_size": 0 00:16:09.177 }, 00:16:09.177 { 00:16:09.177 "name": null, 00:16:09.177 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:09.177 "is_configured": false, 00:16:09.177 "data_offset": 2048, 00:16:09.177 "data_size": 63488 00:16:09.177 }, 00:16:09.177 { 00:16:09.177 "name": "BaseBdev3", 00:16:09.178 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:09.178 "is_configured": true, 00:16:09.178 "data_offset": 2048, 00:16:09.178 "data_size": 63488 00:16:09.178 }, 00:16:09.178 { 00:16:09.178 "name": "BaseBdev4", 00:16:09.178 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:09.178 "is_configured": true, 00:16:09.178 "data_offset": 2048, 00:16:09.178 "data_size": 63488 00:16:09.178 } 00:16:09.178 ] 00:16:09.178 }' 00:16:09.178 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.178 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.745 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.745 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:16:09.745 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:09.745 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:10.005 [2024-07-24 18:52:54.906659] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:10.005 BaseBdev1 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:10.005 18:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.263 18:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:10.263 [ 00:16:10.263 { 00:16:10.263 "name": "BaseBdev1", 00:16:10.263 "aliases": [ 00:16:10.263 "a9573f8a-0492-477e-aa94-d07695307c61" 00:16:10.263 ], 00:16:10.263 "product_name": "Malloc disk", 00:16:10.264 "block_size": 512, 00:16:10.264 "num_blocks": 65536, 00:16:10.264 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:10.264 "assigned_rate_limits": { 00:16:10.264 "rw_ios_per_sec": 0, 00:16:10.264 "rw_mbytes_per_sec": 0, 00:16:10.264 "r_mbytes_per_sec": 0, 00:16:10.264 "w_mbytes_per_sec": 0 00:16:10.264 }, 00:16:10.264 "claimed": true, 00:16:10.264 "claim_type": "exclusive_write", 00:16:10.264 "zoned": false, 00:16:10.264 "supported_io_types": { 00:16:10.264 "read": true, 00:16:10.264 "write": true, 00:16:10.264 "unmap": true, 00:16:10.264 "flush": true, 00:16:10.264 "reset": true, 00:16:10.264 "nvme_admin": false, 00:16:10.264 "nvme_io": false, 00:16:10.264 "nvme_io_md": false, 00:16:10.264 "write_zeroes": true, 00:16:10.264 "zcopy": true, 00:16:10.264 "get_zone_info": false, 00:16:10.264 "zone_management": false, 00:16:10.264 "zone_append": false, 00:16:10.264 "compare": false, 00:16:10.264 "compare_and_write": false, 00:16:10.264 "abort": true, 00:16:10.264 "seek_hole": false, 00:16:10.264 "seek_data": false, 00:16:10.264 "copy": true, 00:16:10.264 "nvme_iov_md": false 00:16:10.264 }, 00:16:10.264 "memory_domains": [ 00:16:10.264 { 00:16:10.264 "dma_device_id": "system", 00:16:10.264 "dma_device_type": 1 00:16:10.264 }, 00:16:10.264 { 00:16:10.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.264 "dma_device_type": 2 00:16:10.264 } 00:16:10.264 ], 00:16:10.264 "driver_specific": {} 00:16:10.264 } 00:16:10.264 ] 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring 
concat 64 4 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.264 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.523 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.523 "name": "Existed_Raid", 00:16:10.523 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:10.523 "strip_size_kb": 64, 00:16:10.523 "state": "configuring", 00:16:10.523 "raid_level": "concat", 00:16:10.523 "superblock": true, 00:16:10.523 "num_base_bdevs": 4, 00:16:10.523 "num_base_bdevs_discovered": 3, 00:16:10.523 "num_base_bdevs_operational": 4, 00:16:10.523 "base_bdevs_list": [ 00:16:10.523 { 00:16:10.523 "name": "BaseBdev1", 00:16:10.523 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:10.523 "is_configured": true, 00:16:10.523 "data_offset": 2048, 00:16:10.523 "data_size": 63488 00:16:10.523 }, 00:16:10.523 { 00:16:10.523 "name": null, 00:16:10.523 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:10.523 "is_configured": false, 00:16:10.523 "data_offset": 2048, 00:16:10.523 "data_size": 63488 00:16:10.523 }, 00:16:10.523 { 00:16:10.523 "name": "BaseBdev3", 00:16:10.523 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:10.523 "is_configured": true, 00:16:10.523 "data_offset": 2048, 00:16:10.523 "data_size": 63488 00:16:10.523 }, 00:16:10.523 { 00:16:10.523 "name": "BaseBdev4", 00:16:10.523 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:10.523 "is_configured": true, 00:16:10.523 "data_offset": 2048, 00:16:10.523 "data_size": 63488 00:16:10.523 } 00:16:10.523 ] 00:16:10.523 }' 00:16:10.523 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.523 18:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.091 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.091 18:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:11.091 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:11.091 18:52:56 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:11.350 [2024-07-24 18:52:56.202016] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.350 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.609 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.609 "name": "Existed_Raid", 00:16:11.609 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:11.609 "strip_size_kb": 64, 00:16:11.609 "state": "configuring", 00:16:11.609 "raid_level": "concat", 00:16:11.609 "superblock": true, 00:16:11.609 "num_base_bdevs": 4, 00:16:11.609 "num_base_bdevs_discovered": 2, 00:16:11.609 "num_base_bdevs_operational": 4, 00:16:11.609 "base_bdevs_list": [ 00:16:11.609 { 00:16:11.609 "name": "BaseBdev1", 00:16:11.609 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:11.609 "is_configured": true, 00:16:11.609 "data_offset": 2048, 00:16:11.609 "data_size": 63488 00:16:11.609 }, 00:16:11.609 { 00:16:11.609 "name": null, 00:16:11.609 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:11.609 "is_configured": false, 00:16:11.609 "data_offset": 2048, 00:16:11.609 "data_size": 63488 00:16:11.609 }, 00:16:11.609 { 00:16:11.609 "name": null, 00:16:11.609 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:11.609 "is_configured": false, 00:16:11.609 "data_offset": 2048, 00:16:11.609 "data_size": 63488 00:16:11.609 }, 00:16:11.609 { 00:16:11.609 "name": "BaseBdev4", 00:16:11.609 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:11.609 "is_configured": true, 00:16:11.609 "data_offset": 2048, 00:16:11.609 "data_size": 63488 00:16:11.609 } 00:16:11.609 ] 00:16:11.609 }' 00:16:11.609 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.609 18:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.868 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.869 18:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:12.127 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:12.127 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:12.386 [2024-07-24 18:52:57.168543] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.386 "name": "Existed_Raid", 00:16:12.386 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:12.386 "strip_size_kb": 64, 00:16:12.386 "state": "configuring", 00:16:12.386 "raid_level": "concat", 00:16:12.386 "superblock": true, 00:16:12.386 "num_base_bdevs": 4, 00:16:12.386 "num_base_bdevs_discovered": 3, 00:16:12.386 "num_base_bdevs_operational": 4, 00:16:12.386 "base_bdevs_list": [ 00:16:12.386 { 00:16:12.386 "name": "BaseBdev1", 00:16:12.386 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:12.386 "is_configured": true, 00:16:12.386 "data_offset": 2048, 00:16:12.386 "data_size": 63488 00:16:12.386 }, 00:16:12.386 { 00:16:12.386 "name": null, 00:16:12.386 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:12.386 "is_configured": false, 00:16:12.386 "data_offset": 2048, 00:16:12.386 "data_size": 63488 00:16:12.386 }, 00:16:12.386 { 00:16:12.386 "name": "BaseBdev3", 00:16:12.386 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:12.386 "is_configured": true, 00:16:12.386 "data_offset": 2048, 00:16:12.386 "data_size": 63488 00:16:12.386 }, 00:16:12.386 { 00:16:12.386 "name": "BaseBdev4", 00:16:12.386 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:12.386 
"is_configured": true, 00:16:12.386 "data_offset": 2048, 00:16:12.386 "data_size": 63488 00:16:12.386 } 00:16:12.386 ] 00:16:12.386 }' 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.386 18:52:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:12.954 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:12.954 18:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:13.213 [2024-07-24 18:52:58.191195] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.213 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.472 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.472 "name": "Existed_Raid", 00:16:13.472 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:13.472 "strip_size_kb": 64, 00:16:13.472 "state": "configuring", 00:16:13.472 "raid_level": "concat", 00:16:13.472 "superblock": true, 00:16:13.472 "num_base_bdevs": 4, 00:16:13.472 "num_base_bdevs_discovered": 2, 00:16:13.472 "num_base_bdevs_operational": 4, 00:16:13.472 "base_bdevs_list": [ 00:16:13.472 { 00:16:13.472 "name": null, 00:16:13.472 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:13.472 "is_configured": false, 00:16:13.472 "data_offset": 2048, 00:16:13.472 "data_size": 63488 00:16:13.472 }, 00:16:13.472 { 00:16:13.472 "name": null, 00:16:13.472 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:13.472 "is_configured": false, 00:16:13.472 "data_offset": 2048, 00:16:13.472 
"data_size": 63488 00:16:13.472 }, 00:16:13.472 { 00:16:13.472 "name": "BaseBdev3", 00:16:13.472 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:13.472 "is_configured": true, 00:16:13.472 "data_offset": 2048, 00:16:13.472 "data_size": 63488 00:16:13.472 }, 00:16:13.472 { 00:16:13.472 "name": "BaseBdev4", 00:16:13.472 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:13.472 "is_configured": true, 00:16:13.472 "data_offset": 2048, 00:16:13.472 "data_size": 63488 00:16:13.472 } 00:16:13.472 ] 00:16:13.472 }' 00:16:13.472 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.472 18:52:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.039 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.039 18:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:14.298 [2024-07-24 18:52:59.219567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.298 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.556 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.556 "name": "Existed_Raid", 00:16:14.556 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:14.556 "strip_size_kb": 64, 00:16:14.556 "state": "configuring", 00:16:14.556 "raid_level": "concat", 00:16:14.556 "superblock": true, 00:16:14.556 "num_base_bdevs": 4, 00:16:14.556 "num_base_bdevs_discovered": 3, 00:16:14.556 "num_base_bdevs_operational": 4, 00:16:14.556 
"base_bdevs_list": [ 00:16:14.556 { 00:16:14.556 "name": null, 00:16:14.556 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:14.556 "is_configured": false, 00:16:14.556 "data_offset": 2048, 00:16:14.556 "data_size": 63488 00:16:14.556 }, 00:16:14.556 { 00:16:14.556 "name": "BaseBdev2", 00:16:14.556 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:14.556 "is_configured": true, 00:16:14.556 "data_offset": 2048, 00:16:14.556 "data_size": 63488 00:16:14.556 }, 00:16:14.556 { 00:16:14.556 "name": "BaseBdev3", 00:16:14.556 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:14.557 "is_configured": true, 00:16:14.557 "data_offset": 2048, 00:16:14.557 "data_size": 63488 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "name": "BaseBdev4", 00:16:14.557 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:14.557 "is_configured": true, 00:16:14.557 "data_offset": 2048, 00:16:14.557 "data_size": 63488 00:16:14.557 } 00:16:14.557 ] 00:16:14.557 }' 00:16:14.557 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.557 18:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:15.126 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.126 18:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:15.126 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:15.126 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.126 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a9573f8a-0492-477e-aa94-d07695307c61 00:16:15.419 [2024-07-24 18:53:00.362598] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:15.419 [2024-07-24 18:53:00.362727] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd35670 00:16:15.419 [2024-07-24 18:53:00.362735] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:15.419 [2024-07-24 18:53:00.362877] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd3eb70 00:16:15.419 [2024-07-24 18:53:00.362965] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd35670 00:16:15.419 [2024-07-24 18:53:00.362971] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd35670 00:16:15.419 [2024-07-24 18:53:00.363037] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:15.419 NewBaseBdev 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.419 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.678 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:15.936 [ 00:16:15.936 { 00:16:15.936 "name": "NewBaseBdev", 00:16:15.936 "aliases": [ 00:16:15.936 "a9573f8a-0492-477e-aa94-d07695307c61" 00:16:15.936 ], 00:16:15.936 "product_name": "Malloc disk", 00:16:15.936 "block_size": 512, 00:16:15.936 "num_blocks": 65536, 00:16:15.936 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:15.936 "assigned_rate_limits": { 00:16:15.936 "rw_ios_per_sec": 0, 00:16:15.936 "rw_mbytes_per_sec": 0, 00:16:15.936 "r_mbytes_per_sec": 0, 00:16:15.936 "w_mbytes_per_sec": 0 00:16:15.936 }, 00:16:15.936 "claimed": true, 00:16:15.936 "claim_type": "exclusive_write", 00:16:15.936 "zoned": false, 00:16:15.936 "supported_io_types": { 00:16:15.936 "read": true, 00:16:15.936 "write": true, 00:16:15.936 "unmap": true, 00:16:15.936 "flush": true, 00:16:15.936 "reset": true, 00:16:15.936 "nvme_admin": false, 00:16:15.936 "nvme_io": false, 00:16:15.936 "nvme_io_md": false, 00:16:15.936 "write_zeroes": true, 00:16:15.936 "zcopy": true, 00:16:15.936 "get_zone_info": false, 00:16:15.936 "zone_management": false, 00:16:15.936 "zone_append": false, 00:16:15.936 "compare": false, 00:16:15.936 "compare_and_write": false, 00:16:15.936 "abort": true, 00:16:15.936 "seek_hole": false, 00:16:15.936 "seek_data": false, 00:16:15.936 "copy": true, 00:16:15.936 "nvme_iov_md": false 00:16:15.936 }, 00:16:15.936 "memory_domains": [ 00:16:15.936 { 00:16:15.936 "dma_device_id": "system", 00:16:15.936 "dma_device_type": 1 00:16:15.936 }, 00:16:15.936 { 00:16:15.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.936 "dma_device_type": 2 00:16:15.936 } 00:16:15.936 ], 00:16:15.936 "driver_specific": {} 00:16:15.936 } 00:16:15.936 ] 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.936 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
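The xtrace above captures the waitforbdev helper: after the malloc bdev backing NewBaseBdev is created, the test issues bdev_wait_for_examine and then polls bdev_get_bdevs with a 2000 ms timeout until the device is visible. A minimal sketch of an equivalent wait loop, built only from the rpc.py calls visible in this log (the retry count and sleep interval are assumptions, not values read out of autotest_common.sh):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

wait_for_bdev() {
    local name=$1 timeout_ms=${2:-2000} i
    # let pending examine callbacks settle before querying the bdev layer
    "$rpc" -s "$sock" bdev_wait_for_examine
    for ((i = 0; i < 20; i++)); do
        # -t forwards the per-call timeout to bdev_get_bdevs, as in the trace
        if "$rpc" -s "$sock" bdev_get_bdevs -b "$name" -t "$timeout_ms" >/dev/null 2>&1; then
            return 0
        fi
        sleep 0.1
    done
    return 1
}

wait_for_bdev NewBaseBdev 2000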
00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.937 "name": "Existed_Raid", 00:16:15.937 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:15.937 "strip_size_kb": 64, 00:16:15.937 "state": "online", 00:16:15.937 "raid_level": "concat", 00:16:15.937 "superblock": true, 00:16:15.937 "num_base_bdevs": 4, 00:16:15.937 "num_base_bdevs_discovered": 4, 00:16:15.937 "num_base_bdevs_operational": 4, 00:16:15.937 "base_bdevs_list": [ 00:16:15.937 { 00:16:15.937 "name": "NewBaseBdev", 00:16:15.937 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:15.937 "is_configured": true, 00:16:15.937 "data_offset": 2048, 00:16:15.937 "data_size": 63488 00:16:15.937 }, 00:16:15.937 { 00:16:15.937 "name": "BaseBdev2", 00:16:15.937 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:15.937 "is_configured": true, 00:16:15.937 "data_offset": 2048, 00:16:15.937 "data_size": 63488 00:16:15.937 }, 00:16:15.937 { 00:16:15.937 "name": "BaseBdev3", 00:16:15.937 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:15.937 "is_configured": true, 00:16:15.937 "data_offset": 2048, 00:16:15.937 "data_size": 63488 00:16:15.937 }, 00:16:15.937 { 00:16:15.937 "name": "BaseBdev4", 00:16:15.937 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:15.937 "is_configured": true, 00:16:15.937 "data_offset": 2048, 00:16:15.937 "data_size": 63488 00:16:15.937 } 00:16:15.937 ] 00:16:15.937 }' 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.937 18:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:16.525 [2024-07-24 18:53:01.437588] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:16.525 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:16.525 "name": "Existed_Raid", 00:16:16.525 "aliases": [ 00:16:16.525 "0febfa1d-ec92-4a08-9eaf-3b748edb3117" 00:16:16.525 ], 00:16:16.525 "product_name": "Raid Volume", 00:16:16.525 "block_size": 512, 00:16:16.525 
"num_blocks": 253952, 00:16:16.525 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:16.525 "assigned_rate_limits": { 00:16:16.525 "rw_ios_per_sec": 0, 00:16:16.525 "rw_mbytes_per_sec": 0, 00:16:16.525 "r_mbytes_per_sec": 0, 00:16:16.525 "w_mbytes_per_sec": 0 00:16:16.525 }, 00:16:16.525 "claimed": false, 00:16:16.525 "zoned": false, 00:16:16.525 "supported_io_types": { 00:16:16.525 "read": true, 00:16:16.525 "write": true, 00:16:16.525 "unmap": true, 00:16:16.525 "flush": true, 00:16:16.525 "reset": true, 00:16:16.525 "nvme_admin": false, 00:16:16.525 "nvme_io": false, 00:16:16.525 "nvme_io_md": false, 00:16:16.525 "write_zeroes": true, 00:16:16.525 "zcopy": false, 00:16:16.525 "get_zone_info": false, 00:16:16.525 "zone_management": false, 00:16:16.525 "zone_append": false, 00:16:16.525 "compare": false, 00:16:16.525 "compare_and_write": false, 00:16:16.525 "abort": false, 00:16:16.525 "seek_hole": false, 00:16:16.525 "seek_data": false, 00:16:16.525 "copy": false, 00:16:16.525 "nvme_iov_md": false 00:16:16.525 }, 00:16:16.525 "memory_domains": [ 00:16:16.525 { 00:16:16.525 "dma_device_id": "system", 00:16:16.525 "dma_device_type": 1 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.525 "dma_device_type": 2 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "system", 00:16:16.525 "dma_device_type": 1 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.525 "dma_device_type": 2 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "system", 00:16:16.525 "dma_device_type": 1 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.525 "dma_device_type": 2 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "system", 00:16:16.525 "dma_device_type": 1 00:16:16.525 }, 00:16:16.525 { 00:16:16.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.525 "dma_device_type": 2 00:16:16.525 } 00:16:16.525 ], 00:16:16.525 "driver_specific": { 00:16:16.525 "raid": { 00:16:16.525 "uuid": "0febfa1d-ec92-4a08-9eaf-3b748edb3117", 00:16:16.525 "strip_size_kb": 64, 00:16:16.525 "state": "online", 00:16:16.525 "raid_level": "concat", 00:16:16.525 "superblock": true, 00:16:16.525 "num_base_bdevs": 4, 00:16:16.525 "num_base_bdevs_discovered": 4, 00:16:16.525 "num_base_bdevs_operational": 4, 00:16:16.525 "base_bdevs_list": [ 00:16:16.526 { 00:16:16.526 "name": "NewBaseBdev", 00:16:16.526 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:16.526 "is_configured": true, 00:16:16.526 "data_offset": 2048, 00:16:16.526 "data_size": 63488 00:16:16.526 }, 00:16:16.526 { 00:16:16.526 "name": "BaseBdev2", 00:16:16.526 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:16.526 "is_configured": true, 00:16:16.526 "data_offset": 2048, 00:16:16.526 "data_size": 63488 00:16:16.526 }, 00:16:16.526 { 00:16:16.526 "name": "BaseBdev3", 00:16:16.526 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:16.526 "is_configured": true, 00:16:16.526 "data_offset": 2048, 00:16:16.526 "data_size": 63488 00:16:16.526 }, 00:16:16.526 { 00:16:16.526 "name": "BaseBdev4", 00:16:16.526 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:16.526 "is_configured": true, 00:16:16.526 "data_offset": 2048, 00:16:16.526 "data_size": 63488 00:16:16.526 } 00:16:16.526 ] 00:16:16.526 } 00:16:16.526 } 00:16:16.526 }' 00:16:16.526 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:16:16.526 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:16.526 BaseBdev2 00:16:16.526 BaseBdev3 00:16:16.526 BaseBdev4' 00:16:16.526 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:16.526 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:16.526 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:16.784 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:16.784 "name": "NewBaseBdev", 00:16:16.784 "aliases": [ 00:16:16.784 "a9573f8a-0492-477e-aa94-d07695307c61" 00:16:16.784 ], 00:16:16.784 "product_name": "Malloc disk", 00:16:16.784 "block_size": 512, 00:16:16.784 "num_blocks": 65536, 00:16:16.784 "uuid": "a9573f8a-0492-477e-aa94-d07695307c61", 00:16:16.784 "assigned_rate_limits": { 00:16:16.784 "rw_ios_per_sec": 0, 00:16:16.784 "rw_mbytes_per_sec": 0, 00:16:16.784 "r_mbytes_per_sec": 0, 00:16:16.784 "w_mbytes_per_sec": 0 00:16:16.784 }, 00:16:16.784 "claimed": true, 00:16:16.784 "claim_type": "exclusive_write", 00:16:16.784 "zoned": false, 00:16:16.784 "supported_io_types": { 00:16:16.784 "read": true, 00:16:16.784 "write": true, 00:16:16.784 "unmap": true, 00:16:16.784 "flush": true, 00:16:16.784 "reset": true, 00:16:16.784 "nvme_admin": false, 00:16:16.784 "nvme_io": false, 00:16:16.784 "nvme_io_md": false, 00:16:16.784 "write_zeroes": true, 00:16:16.784 "zcopy": true, 00:16:16.784 "get_zone_info": false, 00:16:16.784 "zone_management": false, 00:16:16.784 "zone_append": false, 00:16:16.784 "compare": false, 00:16:16.784 "compare_and_write": false, 00:16:16.784 "abort": true, 00:16:16.784 "seek_hole": false, 00:16:16.784 "seek_data": false, 00:16:16.784 "copy": true, 00:16:16.784 "nvme_iov_md": false 00:16:16.784 }, 00:16:16.784 "memory_domains": [ 00:16:16.784 { 00:16:16.784 "dma_device_id": "system", 00:16:16.784 "dma_device_type": 1 00:16:16.784 }, 00:16:16.784 { 00:16:16.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.784 "dma_device_type": 2 00:16:16.784 } 00:16:16.784 ], 00:16:16.784 "driver_specific": {} 00:16:16.784 }' 00:16:16.784 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.784 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.784 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:16.784 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.043 18:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.043 18:53:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.043 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.043 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:17.043 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.302 "name": "BaseBdev2", 00:16:17.302 "aliases": [ 00:16:17.302 "7bf6cb96-43b8-4533-a8a4-d0912e49abfe" 00:16:17.302 ], 00:16:17.302 "product_name": "Malloc disk", 00:16:17.302 "block_size": 512, 00:16:17.302 "num_blocks": 65536, 00:16:17.302 "uuid": "7bf6cb96-43b8-4533-a8a4-d0912e49abfe", 00:16:17.302 "assigned_rate_limits": { 00:16:17.302 "rw_ios_per_sec": 0, 00:16:17.302 "rw_mbytes_per_sec": 0, 00:16:17.302 "r_mbytes_per_sec": 0, 00:16:17.302 "w_mbytes_per_sec": 0 00:16:17.302 }, 00:16:17.302 "claimed": true, 00:16:17.302 "claim_type": "exclusive_write", 00:16:17.302 "zoned": false, 00:16:17.302 "supported_io_types": { 00:16:17.302 "read": true, 00:16:17.302 "write": true, 00:16:17.302 "unmap": true, 00:16:17.302 "flush": true, 00:16:17.302 "reset": true, 00:16:17.302 "nvme_admin": false, 00:16:17.302 "nvme_io": false, 00:16:17.302 "nvme_io_md": false, 00:16:17.302 "write_zeroes": true, 00:16:17.302 "zcopy": true, 00:16:17.302 "get_zone_info": false, 00:16:17.302 "zone_management": false, 00:16:17.302 "zone_append": false, 00:16:17.302 "compare": false, 00:16:17.302 "compare_and_write": false, 00:16:17.302 "abort": true, 00:16:17.302 "seek_hole": false, 00:16:17.302 "seek_data": false, 00:16:17.302 "copy": true, 00:16:17.302 "nvme_iov_md": false 00:16:17.302 }, 00:16:17.302 "memory_domains": [ 00:16:17.302 { 00:16:17.302 "dma_device_id": "system", 00:16:17.302 "dma_device_type": 1 00:16:17.302 }, 00:16:17.302 { 00:16:17.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.302 "dma_device_type": 2 00:16:17.302 } 00:16:17.302 ], 00:16:17.302 "driver_specific": {} 00:16:17.302 }' 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.302 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.561 18:53:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:17.561 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.820 "name": "BaseBdev3", 00:16:17.820 "aliases": [ 00:16:17.820 "3efec0cb-e390-427e-bce2-b2faa0632a52" 00:16:17.820 ], 00:16:17.820 "product_name": "Malloc disk", 00:16:17.820 "block_size": 512, 00:16:17.820 "num_blocks": 65536, 00:16:17.820 "uuid": "3efec0cb-e390-427e-bce2-b2faa0632a52", 00:16:17.820 "assigned_rate_limits": { 00:16:17.820 "rw_ios_per_sec": 0, 00:16:17.820 "rw_mbytes_per_sec": 0, 00:16:17.820 "r_mbytes_per_sec": 0, 00:16:17.820 "w_mbytes_per_sec": 0 00:16:17.820 }, 00:16:17.820 "claimed": true, 00:16:17.820 "claim_type": "exclusive_write", 00:16:17.820 "zoned": false, 00:16:17.820 "supported_io_types": { 00:16:17.820 "read": true, 00:16:17.820 "write": true, 00:16:17.820 "unmap": true, 00:16:17.820 "flush": true, 00:16:17.820 "reset": true, 00:16:17.820 "nvme_admin": false, 00:16:17.820 "nvme_io": false, 00:16:17.820 "nvme_io_md": false, 00:16:17.820 "write_zeroes": true, 00:16:17.820 "zcopy": true, 00:16:17.820 "get_zone_info": false, 00:16:17.820 "zone_management": false, 00:16:17.820 "zone_append": false, 00:16:17.820 "compare": false, 00:16:17.820 "compare_and_write": false, 00:16:17.820 "abort": true, 00:16:17.820 "seek_hole": false, 00:16:17.820 "seek_data": false, 00:16:17.820 "copy": true, 00:16:17.820 "nvme_iov_md": false 00:16:17.820 }, 00:16:17.820 "memory_domains": [ 00:16:17.820 { 00:16:17.820 "dma_device_id": "system", 00:16:17.820 "dma_device_type": 1 00:16:17.820 }, 00:16:17.820 { 00:16:17.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.820 "dma_device_type": 2 00:16:17.820 } 00:16:17.820 ], 00:16:17.820 "driver_specific": {} 00:16:17.820 }' 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.820 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.080 18:53:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:18.080 18:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.339 "name": "BaseBdev4", 00:16:18.339 "aliases": [ 00:16:18.339 "9fdae26f-e462-4bb2-afee-966d37b9556a" 00:16:18.339 ], 00:16:18.339 "product_name": "Malloc disk", 00:16:18.339 "block_size": 512, 00:16:18.339 "num_blocks": 65536, 00:16:18.339 "uuid": "9fdae26f-e462-4bb2-afee-966d37b9556a", 00:16:18.339 "assigned_rate_limits": { 00:16:18.339 "rw_ios_per_sec": 0, 00:16:18.339 "rw_mbytes_per_sec": 0, 00:16:18.339 "r_mbytes_per_sec": 0, 00:16:18.339 "w_mbytes_per_sec": 0 00:16:18.339 }, 00:16:18.339 "claimed": true, 00:16:18.339 "claim_type": "exclusive_write", 00:16:18.339 "zoned": false, 00:16:18.339 "supported_io_types": { 00:16:18.339 "read": true, 00:16:18.339 "write": true, 00:16:18.339 "unmap": true, 00:16:18.339 "flush": true, 00:16:18.339 "reset": true, 00:16:18.339 "nvme_admin": false, 00:16:18.339 "nvme_io": false, 00:16:18.339 "nvme_io_md": false, 00:16:18.339 "write_zeroes": true, 00:16:18.339 "zcopy": true, 00:16:18.339 "get_zone_info": false, 00:16:18.339 "zone_management": false, 00:16:18.339 "zone_append": false, 00:16:18.339 "compare": false, 00:16:18.339 "compare_and_write": false, 00:16:18.339 "abort": true, 00:16:18.339 "seek_hole": false, 00:16:18.339 "seek_data": false, 00:16:18.339 "copy": true, 00:16:18.339 "nvme_iov_md": false 00:16:18.339 }, 00:16:18.339 "memory_domains": [ 00:16:18.339 { 00:16:18.339 "dma_device_id": "system", 00:16:18.339 "dma_device_type": 1 00:16:18.339 }, 00:16:18.339 { 00:16:18.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.339 "dma_device_type": 2 00:16:18.339 } 00:16:18.339 ], 00:16:18.339 "driver_specific": {} 00:16:18.339 }' 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.339 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:18.598 [2024-07-24 18:53:03.550856] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:18.598 [2024-07-24 18:53:03.550876] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:18.598 [2024-07-24 18:53:03.550914] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:18.598 [2024-07-24 18:53:03.550960] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:18.598 [2024-07-24 18:53:03.550965] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd35670 name Existed_Raid, state offline 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2124732 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2124732 ']' 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2124732 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2124732 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2124732' 00:16:18.598 killing process with pid 2124732 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2124732 00:16:18.598 [2024-07-24 18:53:03.597054] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:18.598 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2124732 00:16:18.857 [2024-07-24 18:53:03.654149] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:19.116 18:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:19.116 00:16:19.116 real 0m24.337s 00:16:19.116 user 0m45.173s 00:16:19.116 sys 0m3.786s 00:16:19.116 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:19.116 18:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:19.116 ************************************ 00:16:19.116 END TEST raid_state_function_test_sb 00:16:19.116 ************************************ 00:16:19.116 18:53:03 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:16:19.116 18:53:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:19.116 18:53:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:19.116 18:53:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:19.116 ************************************ 00:16:19.116 START TEST raid_superblock_test 00:16:19.116 ************************************ 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2129458 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2129458 /var/tmp/spdk-raid.sock 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2129458 ']' 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:19.116 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:19.116 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.116 [2024-07-24 18:53:04.056667] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
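At this point raid_superblock_test brings up the standalone bdev_svc application with bdev_raid debug logging on its own RPC socket and waits for it to start listening (pid 2129458 in this run). A minimal sketch of that startup handshake with the wait loop written out explicitly (the polling loop and the rpc_get_methods liveness probe are assumptions; the binary path, socket and -L flag are the ones shown in the log):

spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
sock=/var/tmp/spdk-raid.sock

# bdev-only SPDK app with raid debug logs, on a dedicated RPC socket
"$spdk"/test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &
raid_pid=$!

# do not issue test RPCs until the app answers on that socket
for ((i = 0; i < 100; i++)); do
    if "$spdk"/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; then
        break
    fi
    sleep 0.1
done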
00:16:19.116 [2024-07-24 18:53:04.056707] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2129458 ] 00:16:19.116 [2024-07-24 18:53:04.120794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:19.375 [2024-07-24 18:53:04.200704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:19.375 [2024-07-24 18:53:04.249937] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:19.375 [2024-07-24 18:53:04.249960] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:19.945 18:53:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:20.204 malloc1 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:20.204 [2024-07-24 18:53:05.165918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:20.204 [2024-07-24 18:53:05.165952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:20.204 [2024-07-24 18:53:05.165964] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x185ce20 00:16:20.204 [2024-07-24 18:53:05.165970] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:20.204 [2024-07-24 18:53:05.167182] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:20.204 [2024-07-24 18:53:05.167204] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:20.204 pt1 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:20.204 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:20.462 malloc2 00:16:20.462 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:20.750 [2024-07-24 18:53:05.482407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:20.750 [2024-07-24 18:53:05.482436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:20.750 [2024-07-24 18:53:05.482445] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a06ed0 00:16:20.750 [2024-07-24 18:53:05.482450] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:20.750 [2024-07-24 18:53:05.483455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:20.750 [2024-07-24 18:53:05.483481] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:20.750 pt2 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:20.750 malloc3 00:16:20.750 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:21.009 [2024-07-24 18:53:05.802683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:21.009 [2024-07-24 18:53:05.802716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.009 [2024-07-24 18:53:05.802726] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0aa30 00:16:21.009 [2024-07-24 18:53:05.802732] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.009 [2024-07-24 18:53:05.803779] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.009 [2024-07-24 18:53:05.803814] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:21.009 pt3 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:21.009 malloc4 00:16:21.009 18:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:21.268 [2024-07-24 18:53:06.127067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:21.268 [2024-07-24 18:53:06.127104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.268 [2024-07-24 18:53:06.127114] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a07900 00:16:21.268 [2024-07-24 18:53:06.127119] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.268 [2024-07-24 18:53:06.128230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.268 [2024-07-24 18:53:06.128251] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:21.268 pt4 00:16:21.268 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:21.268 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:21.268 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:21.527 [2024-07-24 18:53:06.295540] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:21.527 [2024-07-24 18:53:06.296480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:21.527 [2024-07-24 18:53:06.296518] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:21.527 [2024-07-24 18:53:06.296546] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:21.527 [2024-07-24 18:53:06.296659] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a0ad40 00:16:21.527 [2024-07-24 18:53:06.296665] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:21.527 [2024-07-24 18:53:06.296802] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a0f140 00:16:21.527 [2024-07-24 18:53:06.296900] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a0ad40 00:16:21.527 [2024-07-24 18:53:06.296906] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a0ad40 00:16:21.527 [2024-07-24 18:53:06.296968] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.527 "name": "raid_bdev1", 00:16:21.527 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:21.527 "strip_size_kb": 64, 00:16:21.527 "state": "online", 00:16:21.527 "raid_level": "concat", 00:16:21.527 "superblock": true, 00:16:21.527 "num_base_bdevs": 4, 00:16:21.527 "num_base_bdevs_discovered": 4, 00:16:21.527 "num_base_bdevs_operational": 4, 00:16:21.527 "base_bdevs_list": [ 00:16:21.527 { 00:16:21.527 "name": "pt1", 00:16:21.527 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.527 "is_configured": true, 00:16:21.527 "data_offset": 2048, 00:16:21.527 "data_size": 63488 00:16:21.527 }, 00:16:21.527 { 00:16:21.527 "name": "pt2", 00:16:21.527 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.527 "is_configured": true, 00:16:21.527 "data_offset": 2048, 00:16:21.527 "data_size": 63488 00:16:21.527 }, 00:16:21.527 { 00:16:21.527 "name": "pt3", 00:16:21.527 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:21.527 "is_configured": true, 00:16:21.527 "data_offset": 2048, 00:16:21.527 "data_size": 63488 00:16:21.527 }, 00:16:21.527 { 00:16:21.527 "name": "pt4", 00:16:21.527 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:21.527 "is_configured": true, 00:16:21.527 "data_offset": 2048, 00:16:21.527 "data_size": 63488 00:16:21.527 } 00:16:21.527 ] 00:16:21.527 }' 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.527 18:53:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:22.093 18:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:22.352 [2024-07-24 18:53:07.121827] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:22.352 "name": "raid_bdev1", 00:16:22.352 "aliases": [ 00:16:22.352 "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c" 00:16:22.352 ], 00:16:22.352 "product_name": "Raid Volume", 00:16:22.352 "block_size": 512, 00:16:22.352 "num_blocks": 253952, 00:16:22.352 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:22.352 "assigned_rate_limits": { 00:16:22.352 "rw_ios_per_sec": 0, 00:16:22.352 "rw_mbytes_per_sec": 0, 00:16:22.352 "r_mbytes_per_sec": 0, 00:16:22.352 "w_mbytes_per_sec": 0 00:16:22.352 }, 00:16:22.352 "claimed": false, 00:16:22.352 "zoned": false, 00:16:22.352 "supported_io_types": { 00:16:22.352 "read": true, 00:16:22.352 "write": true, 00:16:22.352 "unmap": true, 00:16:22.352 "flush": true, 00:16:22.352 "reset": true, 00:16:22.352 "nvme_admin": false, 00:16:22.352 "nvme_io": false, 00:16:22.352 "nvme_io_md": false, 00:16:22.352 "write_zeroes": true, 00:16:22.352 "zcopy": false, 00:16:22.352 "get_zone_info": false, 00:16:22.352 "zone_management": false, 00:16:22.352 "zone_append": false, 00:16:22.352 "compare": false, 00:16:22.352 "compare_and_write": false, 00:16:22.352 "abort": false, 00:16:22.352 "seek_hole": false, 00:16:22.352 "seek_data": false, 00:16:22.352 "copy": false, 00:16:22.352 "nvme_iov_md": false 00:16:22.352 }, 00:16:22.352 "memory_domains": [ 00:16:22.352 { 00:16:22.352 "dma_device_id": "system", 00:16:22.352 "dma_device_type": 1 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.352 "dma_device_type": 2 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "system", 00:16:22.352 "dma_device_type": 1 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.352 "dma_device_type": 2 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "system", 00:16:22.352 "dma_device_type": 1 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.352 "dma_device_type": 2 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "system", 00:16:22.352 "dma_device_type": 1 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.352 "dma_device_type": 2 00:16:22.352 } 00:16:22.352 ], 00:16:22.352 "driver_specific": { 00:16:22.352 "raid": { 00:16:22.352 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:22.352 "strip_size_kb": 64, 00:16:22.352 "state": "online", 00:16:22.352 "raid_level": "concat", 00:16:22.352 "superblock": 
true, 00:16:22.352 "num_base_bdevs": 4, 00:16:22.352 "num_base_bdevs_discovered": 4, 00:16:22.352 "num_base_bdevs_operational": 4, 00:16:22.352 "base_bdevs_list": [ 00:16:22.352 { 00:16:22.352 "name": "pt1", 00:16:22.352 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:22.352 "is_configured": true, 00:16:22.352 "data_offset": 2048, 00:16:22.352 "data_size": 63488 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "name": "pt2", 00:16:22.352 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:22.352 "is_configured": true, 00:16:22.352 "data_offset": 2048, 00:16:22.352 "data_size": 63488 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "name": "pt3", 00:16:22.352 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:22.352 "is_configured": true, 00:16:22.352 "data_offset": 2048, 00:16:22.352 "data_size": 63488 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "name": "pt4", 00:16:22.352 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:22.352 "is_configured": true, 00:16:22.352 "data_offset": 2048, 00:16:22.352 "data_size": 63488 00:16:22.352 } 00:16:22.352 ] 00:16:22.352 } 00:16:22.352 } 00:16:22.352 }' 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:22.352 pt2 00:16:22.352 pt3 00:16:22.352 pt4' 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:22.352 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.611 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.611 "name": "pt1", 00:16:22.611 "aliases": [ 00:16:22.611 "00000000-0000-0000-0000-000000000001" 00:16:22.612 ], 00:16:22.612 "product_name": "passthru", 00:16:22.612 "block_size": 512, 00:16:22.612 "num_blocks": 65536, 00:16:22.612 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:22.612 "assigned_rate_limits": { 00:16:22.612 "rw_ios_per_sec": 0, 00:16:22.612 "rw_mbytes_per_sec": 0, 00:16:22.612 "r_mbytes_per_sec": 0, 00:16:22.612 "w_mbytes_per_sec": 0 00:16:22.612 }, 00:16:22.612 "claimed": true, 00:16:22.612 "claim_type": "exclusive_write", 00:16:22.612 "zoned": false, 00:16:22.612 "supported_io_types": { 00:16:22.612 "read": true, 00:16:22.612 "write": true, 00:16:22.612 "unmap": true, 00:16:22.612 "flush": true, 00:16:22.612 "reset": true, 00:16:22.612 "nvme_admin": false, 00:16:22.612 "nvme_io": false, 00:16:22.612 "nvme_io_md": false, 00:16:22.612 "write_zeroes": true, 00:16:22.612 "zcopy": true, 00:16:22.612 "get_zone_info": false, 00:16:22.612 "zone_management": false, 00:16:22.612 "zone_append": false, 00:16:22.612 "compare": false, 00:16:22.612 "compare_and_write": false, 00:16:22.612 "abort": true, 00:16:22.612 "seek_hole": false, 00:16:22.612 "seek_data": false, 00:16:22.612 "copy": true, 00:16:22.612 "nvme_iov_md": false 00:16:22.612 }, 00:16:22.612 "memory_domains": [ 00:16:22.612 { 00:16:22.612 "dma_device_id": "system", 00:16:22.612 "dma_device_type": 1 00:16:22.612 }, 00:16:22.612 { 00:16:22.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.612 "dma_device_type": 2 00:16:22.612 } 00:16:22.612 ], 00:16:22.612 "driver_specific": { 00:16:22.612 "passthru": 
{ 00:16:22.612 "name": "pt1", 00:16:22.612 "base_bdev_name": "malloc1" 00:16:22.612 } 00:16:22.612 } 00:16:22.612 }' 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.612 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.871 "name": "pt2", 00:16:22.871 "aliases": [ 00:16:22.871 "00000000-0000-0000-0000-000000000002" 00:16:22.871 ], 00:16:22.871 "product_name": "passthru", 00:16:22.871 "block_size": 512, 00:16:22.871 "num_blocks": 65536, 00:16:22.871 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:22.871 "assigned_rate_limits": { 00:16:22.871 "rw_ios_per_sec": 0, 00:16:22.871 "rw_mbytes_per_sec": 0, 00:16:22.871 "r_mbytes_per_sec": 0, 00:16:22.871 "w_mbytes_per_sec": 0 00:16:22.871 }, 00:16:22.871 "claimed": true, 00:16:22.871 "claim_type": "exclusive_write", 00:16:22.871 "zoned": false, 00:16:22.871 "supported_io_types": { 00:16:22.871 "read": true, 00:16:22.871 "write": true, 00:16:22.871 "unmap": true, 00:16:22.871 "flush": true, 00:16:22.871 "reset": true, 00:16:22.871 "nvme_admin": false, 00:16:22.871 "nvme_io": false, 00:16:22.871 "nvme_io_md": false, 00:16:22.871 "write_zeroes": true, 00:16:22.871 "zcopy": true, 00:16:22.871 "get_zone_info": false, 00:16:22.871 "zone_management": false, 00:16:22.871 "zone_append": false, 00:16:22.871 "compare": false, 00:16:22.871 "compare_and_write": false, 00:16:22.871 "abort": true, 00:16:22.871 "seek_hole": false, 00:16:22.871 "seek_data": false, 00:16:22.871 "copy": true, 00:16:22.871 "nvme_iov_md": false 00:16:22.871 }, 00:16:22.871 "memory_domains": [ 00:16:22.871 { 00:16:22.871 "dma_device_id": "system", 00:16:22.871 "dma_device_type": 1 00:16:22.871 }, 00:16:22.871 { 00:16:22.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.871 "dma_device_type": 2 00:16:22.871 } 00:16:22.871 ], 00:16:22.871 "driver_specific": { 00:16:22.871 "passthru": { 00:16:22.871 "name": "pt2", 00:16:22.871 "base_bdev_name": "malloc2" 00:16:22.871 } 00:16:22.871 } 00:16:22.871 }' 00:16:22.871 
18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.871 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.130 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.130 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.130 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.130 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.130 18:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.130 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.130 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.130 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.130 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.389 "name": "pt3", 00:16:23.389 "aliases": [ 00:16:23.389 "00000000-0000-0000-0000-000000000003" 00:16:23.389 ], 00:16:23.389 "product_name": "passthru", 00:16:23.389 "block_size": 512, 00:16:23.389 "num_blocks": 65536, 00:16:23.389 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:23.389 "assigned_rate_limits": { 00:16:23.389 "rw_ios_per_sec": 0, 00:16:23.389 "rw_mbytes_per_sec": 0, 00:16:23.389 "r_mbytes_per_sec": 0, 00:16:23.389 "w_mbytes_per_sec": 0 00:16:23.389 }, 00:16:23.389 "claimed": true, 00:16:23.389 "claim_type": "exclusive_write", 00:16:23.389 "zoned": false, 00:16:23.389 "supported_io_types": { 00:16:23.389 "read": true, 00:16:23.389 "write": true, 00:16:23.389 "unmap": true, 00:16:23.389 "flush": true, 00:16:23.389 "reset": true, 00:16:23.389 "nvme_admin": false, 00:16:23.389 "nvme_io": false, 00:16:23.389 "nvme_io_md": false, 00:16:23.389 "write_zeroes": true, 00:16:23.389 "zcopy": true, 00:16:23.389 "get_zone_info": false, 00:16:23.389 "zone_management": false, 00:16:23.389 "zone_append": false, 00:16:23.389 "compare": false, 00:16:23.389 "compare_and_write": false, 00:16:23.389 "abort": true, 00:16:23.389 "seek_hole": false, 00:16:23.389 "seek_data": false, 00:16:23.389 "copy": true, 00:16:23.389 "nvme_iov_md": false 00:16:23.389 }, 00:16:23.389 "memory_domains": [ 00:16:23.389 { 00:16:23.389 "dma_device_id": "system", 00:16:23.389 "dma_device_type": 1 00:16:23.389 }, 00:16:23.389 { 00:16:23.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.389 "dma_device_type": 2 00:16:23.389 } 00:16:23.389 ], 00:16:23.389 "driver_specific": { 00:16:23.389 "passthru": { 00:16:23.389 "name": "pt3", 00:16:23.389 "base_bdev_name": "malloc3" 00:16:23.389 } 00:16:23.389 } 00:16:23.389 }' 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.389 18:53:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.389 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.648 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.907 "name": "pt4", 00:16:23.907 "aliases": [ 00:16:23.907 "00000000-0000-0000-0000-000000000004" 00:16:23.907 ], 00:16:23.907 "product_name": "passthru", 00:16:23.907 "block_size": 512, 00:16:23.907 "num_blocks": 65536, 00:16:23.907 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:23.907 "assigned_rate_limits": { 00:16:23.907 "rw_ios_per_sec": 0, 00:16:23.907 "rw_mbytes_per_sec": 0, 00:16:23.907 "r_mbytes_per_sec": 0, 00:16:23.907 "w_mbytes_per_sec": 0 00:16:23.907 }, 00:16:23.907 "claimed": true, 00:16:23.907 "claim_type": "exclusive_write", 00:16:23.907 "zoned": false, 00:16:23.907 "supported_io_types": { 00:16:23.907 "read": true, 00:16:23.907 "write": true, 00:16:23.907 "unmap": true, 00:16:23.907 "flush": true, 00:16:23.907 "reset": true, 00:16:23.907 "nvme_admin": false, 00:16:23.907 "nvme_io": false, 00:16:23.907 "nvme_io_md": false, 00:16:23.907 "write_zeroes": true, 00:16:23.907 "zcopy": true, 00:16:23.907 "get_zone_info": false, 00:16:23.907 "zone_management": false, 00:16:23.907 "zone_append": false, 00:16:23.907 "compare": false, 00:16:23.907 "compare_and_write": false, 00:16:23.907 "abort": true, 00:16:23.907 "seek_hole": false, 00:16:23.907 "seek_data": false, 00:16:23.907 "copy": true, 00:16:23.907 "nvme_iov_md": false 00:16:23.907 }, 00:16:23.907 "memory_domains": [ 00:16:23.907 { 00:16:23.907 "dma_device_id": "system", 00:16:23.907 "dma_device_type": 1 00:16:23.907 }, 00:16:23.907 { 00:16:23.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.907 "dma_device_type": 2 00:16:23.907 } 00:16:23.907 ], 00:16:23.907 "driver_specific": { 00:16:23.907 "passthru": { 00:16:23.907 "name": "pt4", 00:16:23.907 "base_bdev_name": "malloc4" 00:16:23.907 } 00:16:23.907 } 00:16:23.907 }' 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.907 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.166 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.166 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.166 18:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:24.166 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:24.425 [2024-07-24 18:53:09.239288] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.425 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c 00:16:24.425 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c ']' 00:16:24.425 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:24.425 [2024-07-24 18:53:09.407523] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:24.425 [2024-07-24 18:53:09.407535] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:24.425 [2024-07-24 18:53:09.407571] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:24.425 [2024-07-24 18:53:09.407615] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:24.425 [2024-07-24 18:53:09.407631] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0ad40 name raid_bdev1, state offline 00:16:24.425 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.425 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:24.683 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:24.683 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:24.683 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:24.683 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:24.942 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:24.942 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:24.942 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:24.942 18:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:25.200 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:25.200 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:25.459 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:25.719 [2024-07-24 18:53:10.574515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:25.719 [2024-07-24 18:53:10.575519] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:25.719 [2024-07-24 18:53:10.575550] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:16:25.719 [2024-07-24 18:53:10.575571] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:25.719 [2024-07-24 18:53:10.575603] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:25.719 [2024-07-24 18:53:10.575628] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:25.719 [2024-07-24 18:53:10.575641] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:25.719 [2024-07-24 18:53:10.575653] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:25.719 [2024-07-24 18:53:10.575662] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:25.719 [2024-07-24 18:53:10.575667] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a07100 name raid_bdev1, state configuring 00:16:25.719 request: 00:16:25.719 { 00:16:25.719 "name": "raid_bdev1", 00:16:25.719 "raid_level": "concat", 00:16:25.719 "base_bdevs": [ 00:16:25.719 "malloc1", 00:16:25.719 "malloc2", 00:16:25.719 "malloc3", 00:16:25.719 "malloc4" 00:16:25.719 ], 00:16:25.719 "strip_size_kb": 64, 00:16:25.719 "superblock": false, 00:16:25.719 "method": "bdev_raid_create", 00:16:25.719 "req_id": 1 00:16:25.719 } 00:16:25.719 Got JSON-RPC error response 00:16:25.719 response: 00:16:25.719 { 00:16:25.719 "code": -17, 00:16:25.719 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:25.719 } 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.719 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:25.978 [2024-07-24 18:53:10.915346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:25.978 [2024-07-24 18:53:10.915372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.978 [2024-07-24 18:53:10.915384] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a09980 00:16:25.978 [2024-07-24 18:53:10.915390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.978 [2024-07-24 18:53:10.916535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.978 [2024-07-24 18:53:10.916555] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:25.978 [2024-07-24 
18:53:10.916600] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:25.978 [2024-07-24 18:53:10.916617] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:25.978 pt1 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.978 18:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:26.237 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.237 "name": "raid_bdev1", 00:16:26.237 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:26.237 "strip_size_kb": 64, 00:16:26.237 "state": "configuring", 00:16:26.237 "raid_level": "concat", 00:16:26.237 "superblock": true, 00:16:26.237 "num_base_bdevs": 4, 00:16:26.237 "num_base_bdevs_discovered": 1, 00:16:26.237 "num_base_bdevs_operational": 4, 00:16:26.237 "base_bdevs_list": [ 00:16:26.237 { 00:16:26.237 "name": "pt1", 00:16:26.237 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:26.237 "is_configured": true, 00:16:26.237 "data_offset": 2048, 00:16:26.237 "data_size": 63488 00:16:26.237 }, 00:16:26.237 { 00:16:26.237 "name": null, 00:16:26.237 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:26.237 "is_configured": false, 00:16:26.237 "data_offset": 2048, 00:16:26.237 "data_size": 63488 00:16:26.237 }, 00:16:26.237 { 00:16:26.237 "name": null, 00:16:26.237 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:26.237 "is_configured": false, 00:16:26.237 "data_offset": 2048, 00:16:26.237 "data_size": 63488 00:16:26.237 }, 00:16:26.237 { 00:16:26.237 "name": null, 00:16:26.237 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:26.237 "is_configured": false, 00:16:26.237 "data_offset": 2048, 00:16:26.237 "data_size": 63488 00:16:26.237 } 00:16:26.237 ] 00:16:26.237 }' 00:16:26.237 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.237 18:53:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.805 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:16:26.805 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:26.805 [2024-07-24 18:53:11.729476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:26.805 [2024-07-24 18:53:11.729513] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.805 [2024-07-24 18:53:11.729527] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a09c80 00:16:26.805 [2024-07-24 18:53:11.729533] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.805 [2024-07-24 18:53:11.729779] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.805 [2024-07-24 18:53:11.729792] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:26.805 [2024-07-24 18:53:11.729835] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:26.805 [2024-07-24 18:53:11.729847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:26.805 pt2 00:16:26.805 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:27.064 [2024-07-24 18:53:11.897914] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.064 18:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:27.323 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.323 "name": "raid_bdev1", 00:16:27.323 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:27.323 "strip_size_kb": 64, 00:16:27.323 "state": "configuring", 00:16:27.323 "raid_level": "concat", 00:16:27.323 "superblock": true, 00:16:27.323 "num_base_bdevs": 4, 00:16:27.323 "num_base_bdevs_discovered": 1, 00:16:27.323 "num_base_bdevs_operational": 4, 00:16:27.323 "base_bdevs_list": [ 00:16:27.323 { 00:16:27.323 "name": "pt1", 00:16:27.323 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:27.323 "is_configured": true, 00:16:27.323 "data_offset": 2048, 00:16:27.323 "data_size": 63488 00:16:27.323 }, 00:16:27.323 
{ 00:16:27.323 "name": null, 00:16:27.323 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:27.323 "is_configured": false, 00:16:27.323 "data_offset": 2048, 00:16:27.323 "data_size": 63488 00:16:27.323 }, 00:16:27.323 { 00:16:27.323 "name": null, 00:16:27.323 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:27.323 "is_configured": false, 00:16:27.323 "data_offset": 2048, 00:16:27.323 "data_size": 63488 00:16:27.323 }, 00:16:27.323 { 00:16:27.323 "name": null, 00:16:27.323 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:27.323 "is_configured": false, 00:16:27.323 "data_offset": 2048, 00:16:27.323 "data_size": 63488 00:16:27.323 } 00:16:27.323 ] 00:16:27.323 }' 00:16:27.323 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.323 18:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.581 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:27.581 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:27.581 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:27.839 [2024-07-24 18:53:12.708051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:27.839 [2024-07-24 18:53:12.708088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.839 [2024-07-24 18:53:12.708100] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a09eb0 00:16:27.839 [2024-07-24 18:53:12.708106] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.839 [2024-07-24 18:53:12.708339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.839 [2024-07-24 18:53:12.708348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:27.839 [2024-07-24 18:53:12.708388] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:27.839 [2024-07-24 18:53:12.708399] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:27.839 pt2 00:16:27.839 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:27.839 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:27.839 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:28.097 [2024-07-24 18:53:12.892524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:28.097 [2024-07-24 18:53:12.892544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.097 [2024-07-24 18:53:12.892551] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x185bda0 00:16:28.097 [2024-07-24 18:53:12.892557] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.097 [2024-07-24 18:53:12.892754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.097 [2024-07-24 18:53:12.892765] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:28.097 [2024-07-24 18:53:12.892794] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:28.097 [2024-07-24 18:53:12.892804] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:28.097 pt3 00:16:28.097 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:28.097 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:28.097 18:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:28.097 [2024-07-24 18:53:13.064982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:28.097 [2024-07-24 18:53:13.065006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.097 [2024-07-24 18:53:13.065015] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0ced0 00:16:28.097 [2024-07-24 18:53:13.065022] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.097 [2024-07-24 18:53:13.065248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.097 [2024-07-24 18:53:13.065259] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:28.097 [2024-07-24 18:53:13.065295] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:16:28.097 [2024-07-24 18:53:13.065308] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:28.097 [2024-07-24 18:53:13.065398] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a093c0 00:16:28.097 [2024-07-24 18:53:13.065405] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:28.097 [2024-07-24 18:53:13.065549] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a09890 00:16:28.097 [2024-07-24 18:53:13.065655] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a093c0 00:16:28.097 [2024-07-24 18:53:13.065661] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a093c0 00:16:28.097 [2024-07-24 18:53:13.065735] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:28.097 pt4 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.097 18:53:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.097 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:28.355 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.355 "name": "raid_bdev1", 00:16:28.355 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:28.355 "strip_size_kb": 64, 00:16:28.355 "state": "online", 00:16:28.355 "raid_level": "concat", 00:16:28.355 "superblock": true, 00:16:28.355 "num_base_bdevs": 4, 00:16:28.355 "num_base_bdevs_discovered": 4, 00:16:28.355 "num_base_bdevs_operational": 4, 00:16:28.355 "base_bdevs_list": [ 00:16:28.355 { 00:16:28.355 "name": "pt1", 00:16:28.355 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.355 "is_configured": true, 00:16:28.355 "data_offset": 2048, 00:16:28.355 "data_size": 63488 00:16:28.355 }, 00:16:28.355 { 00:16:28.355 "name": "pt2", 00:16:28.355 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:28.355 "is_configured": true, 00:16:28.355 "data_offset": 2048, 00:16:28.355 "data_size": 63488 00:16:28.355 }, 00:16:28.355 { 00:16:28.355 "name": "pt3", 00:16:28.355 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:28.355 "is_configured": true, 00:16:28.355 "data_offset": 2048, 00:16:28.355 "data_size": 63488 00:16:28.355 }, 00:16:28.355 { 00:16:28.355 "name": "pt4", 00:16:28.355 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:28.355 "is_configured": true, 00:16:28.355 "data_offset": 2048, 00:16:28.355 "data_size": 63488 00:16:28.355 } 00:16:28.355 ] 00:16:28.356 }' 00:16:28.356 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.356 18:53:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:28.921 [2024-07-24 18:53:13.907353] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.921 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:28.921 "name": "raid_bdev1", 00:16:28.921 "aliases": [ 00:16:28.921 "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c" 00:16:28.921 ], 00:16:28.921 "product_name": "Raid Volume", 00:16:28.921 "block_size": 512, 00:16:28.921 "num_blocks": 253952, 00:16:28.921 "uuid": 
"b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:28.921 "assigned_rate_limits": { 00:16:28.921 "rw_ios_per_sec": 0, 00:16:28.921 "rw_mbytes_per_sec": 0, 00:16:28.921 "r_mbytes_per_sec": 0, 00:16:28.921 "w_mbytes_per_sec": 0 00:16:28.921 }, 00:16:28.921 "claimed": false, 00:16:28.921 "zoned": false, 00:16:28.921 "supported_io_types": { 00:16:28.921 "read": true, 00:16:28.921 "write": true, 00:16:28.921 "unmap": true, 00:16:28.921 "flush": true, 00:16:28.921 "reset": true, 00:16:28.921 "nvme_admin": false, 00:16:28.921 "nvme_io": false, 00:16:28.921 "nvme_io_md": false, 00:16:28.921 "write_zeroes": true, 00:16:28.921 "zcopy": false, 00:16:28.921 "get_zone_info": false, 00:16:28.921 "zone_management": false, 00:16:28.921 "zone_append": false, 00:16:28.921 "compare": false, 00:16:28.921 "compare_and_write": false, 00:16:28.921 "abort": false, 00:16:28.921 "seek_hole": false, 00:16:28.921 "seek_data": false, 00:16:28.921 "copy": false, 00:16:28.921 "nvme_iov_md": false 00:16:28.921 }, 00:16:28.921 "memory_domains": [ 00:16:28.921 { 00:16:28.921 "dma_device_id": "system", 00:16:28.921 "dma_device_type": 1 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.921 "dma_device_type": 2 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "system", 00:16:28.921 "dma_device_type": 1 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.921 "dma_device_type": 2 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "system", 00:16:28.921 "dma_device_type": 1 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.921 "dma_device_type": 2 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "system", 00:16:28.921 "dma_device_type": 1 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.921 "dma_device_type": 2 00:16:28.921 } 00:16:28.921 ], 00:16:28.921 "driver_specific": { 00:16:28.921 "raid": { 00:16:28.921 "uuid": "b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c", 00:16:28.921 "strip_size_kb": 64, 00:16:28.921 "state": "online", 00:16:28.921 "raid_level": "concat", 00:16:28.921 "superblock": true, 00:16:28.921 "num_base_bdevs": 4, 00:16:28.921 "num_base_bdevs_discovered": 4, 00:16:28.921 "num_base_bdevs_operational": 4, 00:16:28.921 "base_bdevs_list": [ 00:16:28.921 { 00:16:28.921 "name": "pt1", 00:16:28.921 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.921 "is_configured": true, 00:16:28.921 "data_offset": 2048, 00:16:28.921 "data_size": 63488 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "name": "pt2", 00:16:28.921 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:28.921 "is_configured": true, 00:16:28.921 "data_offset": 2048, 00:16:28.921 "data_size": 63488 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "name": "pt3", 00:16:28.921 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:28.921 "is_configured": true, 00:16:28.921 "data_offset": 2048, 00:16:28.921 "data_size": 63488 00:16:28.921 }, 00:16:28.921 { 00:16:28.921 "name": "pt4", 00:16:28.921 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:28.921 "is_configured": true, 00:16:28.921 "data_offset": 2048, 00:16:28.921 "data_size": 63488 00:16:28.921 } 00:16:28.921 ] 00:16:28.921 } 00:16:28.921 } 00:16:28.921 }' 00:16:29.179 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:29.179 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:16:29.179 pt2 00:16:29.179 pt3 00:16:29.179 pt4' 00:16:29.179 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.179 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:29.179 18:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.179 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.179 "name": "pt1", 00:16:29.179 "aliases": [ 00:16:29.179 "00000000-0000-0000-0000-000000000001" 00:16:29.179 ], 00:16:29.179 "product_name": "passthru", 00:16:29.179 "block_size": 512, 00:16:29.179 "num_blocks": 65536, 00:16:29.179 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:29.179 "assigned_rate_limits": { 00:16:29.179 "rw_ios_per_sec": 0, 00:16:29.179 "rw_mbytes_per_sec": 0, 00:16:29.179 "r_mbytes_per_sec": 0, 00:16:29.179 "w_mbytes_per_sec": 0 00:16:29.179 }, 00:16:29.179 "claimed": true, 00:16:29.179 "claim_type": "exclusive_write", 00:16:29.179 "zoned": false, 00:16:29.179 "supported_io_types": { 00:16:29.179 "read": true, 00:16:29.179 "write": true, 00:16:29.179 "unmap": true, 00:16:29.179 "flush": true, 00:16:29.179 "reset": true, 00:16:29.179 "nvme_admin": false, 00:16:29.179 "nvme_io": false, 00:16:29.179 "nvme_io_md": false, 00:16:29.179 "write_zeroes": true, 00:16:29.179 "zcopy": true, 00:16:29.179 "get_zone_info": false, 00:16:29.179 "zone_management": false, 00:16:29.179 "zone_append": false, 00:16:29.179 "compare": false, 00:16:29.179 "compare_and_write": false, 00:16:29.179 "abort": true, 00:16:29.179 "seek_hole": false, 00:16:29.179 "seek_data": false, 00:16:29.179 "copy": true, 00:16:29.179 "nvme_iov_md": false 00:16:29.179 }, 00:16:29.179 "memory_domains": [ 00:16:29.179 { 00:16:29.179 "dma_device_id": "system", 00:16:29.179 "dma_device_type": 1 00:16:29.179 }, 00:16:29.179 { 00:16:29.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.179 "dma_device_type": 2 00:16:29.179 } 00:16:29.179 ], 00:16:29.179 "driver_specific": { 00:16:29.179 "passthru": { 00:16:29.179 "name": "pt1", 00:16:29.179 "base_bdev_name": "malloc1" 00:16:29.179 } 00:16:29.179 } 00:16:29.179 }' 00:16:29.179 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.179 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:29.437 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.695 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.695 "name": "pt2", 00:16:29.695 "aliases": [ 00:16:29.695 "00000000-0000-0000-0000-000000000002" 00:16:29.695 ], 00:16:29.695 "product_name": "passthru", 00:16:29.695 "block_size": 512, 00:16:29.695 "num_blocks": 65536, 00:16:29.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:29.695 "assigned_rate_limits": { 00:16:29.695 "rw_ios_per_sec": 0, 00:16:29.695 "rw_mbytes_per_sec": 0, 00:16:29.695 "r_mbytes_per_sec": 0, 00:16:29.695 "w_mbytes_per_sec": 0 00:16:29.695 }, 00:16:29.695 "claimed": true, 00:16:29.695 "claim_type": "exclusive_write", 00:16:29.695 "zoned": false, 00:16:29.695 "supported_io_types": { 00:16:29.695 "read": true, 00:16:29.695 "write": true, 00:16:29.695 "unmap": true, 00:16:29.695 "flush": true, 00:16:29.695 "reset": true, 00:16:29.696 "nvme_admin": false, 00:16:29.696 "nvme_io": false, 00:16:29.696 "nvme_io_md": false, 00:16:29.696 "write_zeroes": true, 00:16:29.696 "zcopy": true, 00:16:29.696 "get_zone_info": false, 00:16:29.696 "zone_management": false, 00:16:29.696 "zone_append": false, 00:16:29.696 "compare": false, 00:16:29.696 "compare_and_write": false, 00:16:29.696 "abort": true, 00:16:29.696 "seek_hole": false, 00:16:29.696 "seek_data": false, 00:16:29.696 "copy": true, 00:16:29.696 "nvme_iov_md": false 00:16:29.696 }, 00:16:29.696 "memory_domains": [ 00:16:29.696 { 00:16:29.696 "dma_device_id": "system", 00:16:29.696 "dma_device_type": 1 00:16:29.696 }, 00:16:29.696 { 00:16:29.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.696 "dma_device_type": 2 00:16:29.696 } 00:16:29.696 ], 00:16:29.696 "driver_specific": { 00:16:29.696 "passthru": { 00:16:29.696 "name": "pt2", 00:16:29.696 "base_bdev_name": "malloc2" 00:16:29.696 } 00:16:29.696 } 00:16:29.696 }' 00:16:29.696 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.696 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.696 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.696 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:16:29.954 18:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.211 "name": "pt3", 00:16:30.211 "aliases": [ 00:16:30.211 "00000000-0000-0000-0000-000000000003" 00:16:30.211 ], 00:16:30.211 "product_name": "passthru", 00:16:30.211 "block_size": 512, 00:16:30.211 "num_blocks": 65536, 00:16:30.211 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:30.211 "assigned_rate_limits": { 00:16:30.211 "rw_ios_per_sec": 0, 00:16:30.211 "rw_mbytes_per_sec": 0, 00:16:30.211 "r_mbytes_per_sec": 0, 00:16:30.211 "w_mbytes_per_sec": 0 00:16:30.211 }, 00:16:30.211 "claimed": true, 00:16:30.211 "claim_type": "exclusive_write", 00:16:30.211 "zoned": false, 00:16:30.211 "supported_io_types": { 00:16:30.211 "read": true, 00:16:30.211 "write": true, 00:16:30.211 "unmap": true, 00:16:30.211 "flush": true, 00:16:30.211 "reset": true, 00:16:30.211 "nvme_admin": false, 00:16:30.211 "nvme_io": false, 00:16:30.211 "nvme_io_md": false, 00:16:30.211 "write_zeroes": true, 00:16:30.211 "zcopy": true, 00:16:30.211 "get_zone_info": false, 00:16:30.211 "zone_management": false, 00:16:30.211 "zone_append": false, 00:16:30.211 "compare": false, 00:16:30.211 "compare_and_write": false, 00:16:30.211 "abort": true, 00:16:30.211 "seek_hole": false, 00:16:30.211 "seek_data": false, 00:16:30.211 "copy": true, 00:16:30.211 "nvme_iov_md": false 00:16:30.211 }, 00:16:30.211 "memory_domains": [ 00:16:30.211 { 00:16:30.211 "dma_device_id": "system", 00:16:30.211 "dma_device_type": 1 00:16:30.211 }, 00:16:30.211 { 00:16:30.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.211 "dma_device_type": 2 00:16:30.211 } 00:16:30.211 ], 00:16:30.211 "driver_specific": { 00:16:30.211 "passthru": { 00:16:30.211 "name": "pt3", 00:16:30.211 "base_bdev_name": "malloc3" 00:16:30.211 } 00:16:30.211 } 00:16:30.211 }' 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.211 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.469 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:30.469 
18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.727 "name": "pt4", 00:16:30.727 "aliases": [ 00:16:30.727 "00000000-0000-0000-0000-000000000004" 00:16:30.727 ], 00:16:30.727 "product_name": "passthru", 00:16:30.727 "block_size": 512, 00:16:30.727 "num_blocks": 65536, 00:16:30.727 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:30.727 "assigned_rate_limits": { 00:16:30.727 "rw_ios_per_sec": 0, 00:16:30.727 "rw_mbytes_per_sec": 0, 00:16:30.727 "r_mbytes_per_sec": 0, 00:16:30.727 "w_mbytes_per_sec": 0 00:16:30.727 }, 00:16:30.727 "claimed": true, 00:16:30.727 "claim_type": "exclusive_write", 00:16:30.727 "zoned": false, 00:16:30.727 "supported_io_types": { 00:16:30.727 "read": true, 00:16:30.727 "write": true, 00:16:30.727 "unmap": true, 00:16:30.727 "flush": true, 00:16:30.727 "reset": true, 00:16:30.727 "nvme_admin": false, 00:16:30.727 "nvme_io": false, 00:16:30.727 "nvme_io_md": false, 00:16:30.727 "write_zeroes": true, 00:16:30.727 "zcopy": true, 00:16:30.727 "get_zone_info": false, 00:16:30.727 "zone_management": false, 00:16:30.727 "zone_append": false, 00:16:30.727 "compare": false, 00:16:30.727 "compare_and_write": false, 00:16:30.727 "abort": true, 00:16:30.727 "seek_hole": false, 00:16:30.727 "seek_data": false, 00:16:30.727 "copy": true, 00:16:30.727 "nvme_iov_md": false 00:16:30.727 }, 00:16:30.727 "memory_domains": [ 00:16:30.727 { 00:16:30.727 "dma_device_id": "system", 00:16:30.727 "dma_device_type": 1 00:16:30.727 }, 00:16:30.727 { 00:16:30.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.727 "dma_device_type": 2 00:16:30.727 } 00:16:30.727 ], 00:16:30.727 "driver_specific": { 00:16:30.727 "passthru": { 00:16:30.727 "name": "pt4", 00:16:30.727 "base_bdev_name": "malloc4" 00:16:30.727 } 00:16:30.727 } 00:16:30.727 }' 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.727 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:30.985 18:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:30.985 [2024-07-24 18:53:15.992824] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:31.244 18:53:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c '!=' b7d29aeb-8d1c-40ad-a7cb-fb2de02beb2c ']' 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2129458 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2129458 ']' 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2129458 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2129458 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2129458' 00:16:31.244 killing process with pid 2129458 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2129458 00:16:31.244 [2024-07-24 18:53:16.041868] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:31.244 [2024-07-24 18:53:16.041913] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2129458 00:16:31.244 [2024-07-24 18:53:16.041956] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:31.244 [2024-07-24 18:53:16.041962] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a093c0 name raid_bdev1, state offline 00:16:31.244 [2024-07-24 18:53:16.073289] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:31.244 00:16:31.244 real 0m12.236s 00:16:31.244 user 0m22.406s 00:16:31.244 sys 0m1.837s 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:31.244 18:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.244 ************************************ 00:16:31.244 END TEST raid_superblock_test 00:16:31.244 ************************************ 00:16:31.519 18:53:16 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:16:31.519 18:53:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:31.519 18:53:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:31.519 18:53:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:31.519 ************************************ 00:16:31.519 START TEST raid_read_error_test 00:16:31.519 ************************************ 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:16:31.519 18:53:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CX6VF5Tve2 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2131840 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2131840 /var/tmp/spdk-raid.sock 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2131840 ']' 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 
-- # local max_retries=100 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:31.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:31.519 18:53:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.519 [2024-07-24 18:53:16.372624] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:16:31.519 [2024-07-24 18:53:16.372664] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2131840 ] 00:16:31.519 [2024-07-24 18:53:16.434781] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:31.519 [2024-07-24 18:53:16.512707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:31.797 [2024-07-24 18:53:16.563065] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.797 [2024-07-24 18:53:16.563090] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.364 18:53:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:32.364 18:53:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:32.364 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:32.364 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:32.364 BaseBdev1_malloc 00:16:32.364 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:32.623 true 00:16:32.623 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:32.881 [2024-07-24 18:53:17.654901] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:32.881 [2024-07-24 18:53:17.654934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:32.881 [2024-07-24 18:53:17.654945] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbbd20 00:16:32.881 [2024-07-24 18:53:17.654951] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:32.881 [2024-07-24 18:53:17.656130] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:32.881 [2024-07-24 18:53:17.656152] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:32.881 BaseBdev1 00:16:32.881 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:32.881 18:53:17 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:32.881 BaseBdev2_malloc 00:16:32.881 18:53:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:33.140 true 00:16:33.140 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:33.398 [2024-07-24 18:53:18.155777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:33.398 [2024-07-24 18:53:18.155808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:33.398 [2024-07-24 18:53:18.155819] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc0d50 00:16:33.398 [2024-07-24 18:53:18.155824] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:33.398 [2024-07-24 18:53:18.156841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:33.398 [2024-07-24 18:53:18.156861] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:33.398 BaseBdev2 00:16:33.398 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:33.398 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:33.398 BaseBdev3_malloc 00:16:33.398 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:33.655 true 00:16:33.655 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:33.655 [2024-07-24 18:53:18.644404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:33.655 [2024-07-24 18:53:18.644436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:33.655 [2024-07-24 18:53:18.644446] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbfef0 00:16:33.655 [2024-07-24 18:53:18.644452] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:33.655 [2024-07-24 18:53:18.645543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:33.655 [2024-07-24 18:53:18.645564] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:33.655 BaseBdev3 00:16:33.655 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:33.655 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:33.913 BaseBdev4_malloc 00:16:33.914 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:34.172 true 00:16:34.172 18:53:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:34.172 [2024-07-24 18:53:19.137172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:34.172 [2024-07-24 18:53:19.137203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:34.172 [2024-07-24 18:53:19.137215] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc4280 00:16:34.172 [2024-07-24 18:53:19.137221] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:34.172 [2024-07-24 18:53:19.138281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:34.172 [2024-07-24 18:53:19.138301] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:34.172 BaseBdev4 00:16:34.172 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:34.430 [2024-07-24 18:53:19.293605] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:34.430 [2024-07-24 18:53:19.294491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:34.430 [2024-07-24 18:53:19.294538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.430 [2024-07-24 18:53:19.294575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:34.430 [2024-07-24 18:53:19.294737] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bc5d90 00:16:34.430 [2024-07-24 18:53:19.294743] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:34.430 [2024-07-24 18:53:19.294875] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc38d0 00:16:34.430 [2024-07-24 18:53:19.294977] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bc5d90 00:16:34.430 [2024-07-24 18:53:19.294982] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bc5d90 00:16:34.431 [2024-07-24 18:53:19.295049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.431 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:34.688 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.689 "name": "raid_bdev1", 00:16:34.689 "uuid": "b439a3a7-56c1-4844-a57a-185347871d2b", 00:16:34.689 "strip_size_kb": 64, 00:16:34.689 "state": "online", 00:16:34.689 "raid_level": "concat", 00:16:34.689 "superblock": true, 00:16:34.689 "num_base_bdevs": 4, 00:16:34.689 "num_base_bdevs_discovered": 4, 00:16:34.689 "num_base_bdevs_operational": 4, 00:16:34.689 "base_bdevs_list": [ 00:16:34.689 { 00:16:34.689 "name": "BaseBdev1", 00:16:34.689 "uuid": "728cb84a-1423-5883-b0fe-c3b168e84e9b", 00:16:34.689 "is_configured": true, 00:16:34.689 "data_offset": 2048, 00:16:34.689 "data_size": 63488 00:16:34.689 }, 00:16:34.689 { 00:16:34.689 "name": "BaseBdev2", 00:16:34.689 "uuid": "9b85f439-a553-5c3f-8df7-2c4156fa861d", 00:16:34.689 "is_configured": true, 00:16:34.689 "data_offset": 2048, 00:16:34.689 "data_size": 63488 00:16:34.689 }, 00:16:34.689 { 00:16:34.689 "name": "BaseBdev3", 00:16:34.689 "uuid": "6211ddde-0cc2-5a83-a8b2-dd7dff4810fa", 00:16:34.689 "is_configured": true, 00:16:34.689 "data_offset": 2048, 00:16:34.689 "data_size": 63488 00:16:34.689 }, 00:16:34.689 { 00:16:34.689 "name": "BaseBdev4", 00:16:34.689 "uuid": "d63553bb-7c35-5215-a7de-e812f74aced3", 00:16:34.689 "is_configured": true, 00:16:34.689 "data_offset": 2048, 00:16:34.689 "data_size": 63488 00:16:34.689 } 00:16:34.689 ] 00:16:34.689 }' 00:16:34.689 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.689 18:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.255 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:35.255 18:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:35.255 [2024-07-24 18:53:20.047821] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc9210 00:16:36.191 18:53:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.191 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:36.450 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.450 "name": "raid_bdev1", 00:16:36.450 "uuid": "b439a3a7-56c1-4844-a57a-185347871d2b", 00:16:36.450 "strip_size_kb": 64, 00:16:36.450 "state": "online", 00:16:36.450 "raid_level": "concat", 00:16:36.450 "superblock": true, 00:16:36.450 "num_base_bdevs": 4, 00:16:36.450 "num_base_bdevs_discovered": 4, 00:16:36.450 "num_base_bdevs_operational": 4, 00:16:36.450 "base_bdevs_list": [ 00:16:36.450 { 00:16:36.450 "name": "BaseBdev1", 00:16:36.450 "uuid": "728cb84a-1423-5883-b0fe-c3b168e84e9b", 00:16:36.450 "is_configured": true, 00:16:36.450 "data_offset": 2048, 00:16:36.450 "data_size": 63488 00:16:36.450 }, 00:16:36.450 { 00:16:36.450 "name": "BaseBdev2", 00:16:36.450 "uuid": "9b85f439-a553-5c3f-8df7-2c4156fa861d", 00:16:36.450 "is_configured": true, 00:16:36.450 "data_offset": 2048, 00:16:36.450 "data_size": 63488 00:16:36.450 }, 00:16:36.450 { 00:16:36.450 "name": "BaseBdev3", 00:16:36.450 "uuid": "6211ddde-0cc2-5a83-a8b2-dd7dff4810fa", 00:16:36.450 "is_configured": true, 00:16:36.450 "data_offset": 2048, 00:16:36.450 "data_size": 63488 00:16:36.450 }, 00:16:36.450 { 00:16:36.450 "name": "BaseBdev4", 00:16:36.450 "uuid": "d63553bb-7c35-5215-a7de-e812f74aced3", 00:16:36.450 "is_configured": true, 00:16:36.450 "data_offset": 2048, 00:16:36.450 "data_size": 63488 00:16:36.450 } 00:16:36.450 ] 00:16:36.450 }' 00:16:36.450 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.450 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:37.017 [2024-07-24 18:53:21.972979] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:37.017 [2024-07-24 18:53:21.973005] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:37.017 [2024-07-24 18:53:21.975033] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:37.017 [2024-07-24 18:53:21.975059] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:37.017 [2024-07-24 18:53:21.975084] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:37.017 [2024-07-24 18:53:21.975090] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bc5d90 name 
raid_bdev1, state offline 00:16:37.017 0 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2131840 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2131840 ']' 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2131840 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:37.017 18:53:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2131840 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2131840' 00:16:37.275 killing process with pid 2131840 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2131840 00:16:37.275 [2024-07-24 18:53:22.032223] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2131840 00:16:37.275 [2024-07-24 18:53:22.058286] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CX6VF5Tve2 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:37.275 00:16:37.275 real 0m5.934s 00:16:37.275 user 0m9.353s 00:16:37.275 sys 0m0.850s 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:37.275 18:53:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.275 ************************************ 00:16:37.275 END TEST raid_read_error_test 00:16:37.275 ************************************ 00:16:37.275 18:53:22 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:16:37.275 18:53:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:37.275 18:53:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:37.275 18:53:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:37.534 ************************************ 00:16:37.534 START TEST raid_write_error_test 00:16:37.534 ************************************ 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:37.534 18:53:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vkqrTjATVW 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2132854 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2132854 /var/tmp/spdk-raid.sock 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2132854 ']' 00:16:37.534 18:53:22 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:37.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:37.534 18:53:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.534 [2024-07-24 18:53:22.376485] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:16:37.534 [2024-07-24 18:53:22.376528] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2132854 ] 00:16:37.534 [2024-07-24 18:53:22.443283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:37.534 [2024-07-24 18:53:22.521883] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:37.792 [2024-07-24 18:53:22.579427] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:37.792 [2024-07-24 18:53:22.579457] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:38.358 18:53:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:38.358 18:53:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:38.358 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:38.358 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:38.358 BaseBdev1_malloc 00:16:38.358 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:38.616 true 00:16:38.616 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:38.875 [2024-07-24 18:53:23.643511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:38.875 [2024-07-24 18:53:23.643544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:38.875 [2024-07-24 18:53:23.643554] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18b0d20 00:16:38.875 [2024-07-24 18:53:23.643559] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:38.875 [2024-07-24 18:53:23.644616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:38.875 [2024-07-24 18:53:23.644637] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:38.875 BaseBdev1 00:16:38.875 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:38.875 18:53:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:38.875 BaseBdev2_malloc 00:16:38.875 18:53:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:39.133 true 00:16:39.133 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:39.392 [2024-07-24 18:53:24.176296] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:39.392 [2024-07-24 18:53:24.176325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:39.392 [2024-07-24 18:53:24.176334] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18b5d50 00:16:39.392 [2024-07-24 18:53:24.176340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:39.392 [2024-07-24 18:53:24.177270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:39.392 [2024-07-24 18:53:24.177288] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:39.392 BaseBdev2 00:16:39.392 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:39.392 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:39.392 BaseBdev3_malloc 00:16:39.392 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:39.650 true 00:16:39.650 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:39.908 [2024-07-24 18:53:24.684923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:39.908 [2024-07-24 18:53:24.684951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:39.908 [2024-07-24 18:53:24.684960] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18b4ef0 00:16:39.908 [2024-07-24 18:53:24.684966] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:39.908 [2024-07-24 18:53:24.685914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:39.908 [2024-07-24 18:53:24.685933] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:39.908 BaseBdev3 00:16:39.908 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:39.908 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:39.908 BaseBdev4_malloc 00:16:39.908 18:53:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create 
BaseBdev4_malloc 00:16:40.166 true 00:16:40.166 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:40.424 [2024-07-24 18:53:25.197500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:40.424 [2024-07-24 18:53:25.197529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:40.424 [2024-07-24 18:53:25.197540] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18b9280 00:16:40.424 [2024-07-24 18:53:25.197546] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:40.424 [2024-07-24 18:53:25.198547] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:40.424 [2024-07-24 18:53:25.198569] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:40.424 BaseBdev4 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:40.424 [2024-07-24 18:53:25.373993] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:40.424 [2024-07-24 18:53:25.374919] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.424 [2024-07-24 18:53:25.374967] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:40.424 [2024-07-24 18:53:25.375007] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:40.424 [2024-07-24 18:53:25.375173] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18bad90 00:16:40.424 [2024-07-24 18:53:25.375180] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:40.424 [2024-07-24 18:53:25.375317] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18b88d0 00:16:40.424 [2024-07-24 18:53:25.375425] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18bad90 00:16:40.424 [2024-07-24 18:53:25.375431] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18bad90 00:16:40.424 [2024-07-24 18:53:25.375508] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.424 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:40.682 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.682 "name": "raid_bdev1", 00:16:40.682 "uuid": "a0ac3aa9-544d-4933-baaa-14a7c2e35b69", 00:16:40.682 "strip_size_kb": 64, 00:16:40.682 "state": "online", 00:16:40.682 "raid_level": "concat", 00:16:40.682 "superblock": true, 00:16:40.682 "num_base_bdevs": 4, 00:16:40.682 "num_base_bdevs_discovered": 4, 00:16:40.682 "num_base_bdevs_operational": 4, 00:16:40.682 "base_bdevs_list": [ 00:16:40.682 { 00:16:40.682 "name": "BaseBdev1", 00:16:40.682 "uuid": "2028f70c-f87f-5cf1-aab6-a97ea1d5120f", 00:16:40.682 "is_configured": true, 00:16:40.682 "data_offset": 2048, 00:16:40.682 "data_size": 63488 00:16:40.682 }, 00:16:40.682 { 00:16:40.682 "name": "BaseBdev2", 00:16:40.682 "uuid": "b8011f7a-3dfb-5741-96dd-3148d919159c", 00:16:40.682 "is_configured": true, 00:16:40.682 "data_offset": 2048, 00:16:40.682 "data_size": 63488 00:16:40.682 }, 00:16:40.682 { 00:16:40.682 "name": "BaseBdev3", 00:16:40.682 "uuid": "3d73d16b-0c31-5ed0-aee5-66965e07a119", 00:16:40.682 "is_configured": true, 00:16:40.682 "data_offset": 2048, 00:16:40.682 "data_size": 63488 00:16:40.682 }, 00:16:40.682 { 00:16:40.682 "name": "BaseBdev4", 00:16:40.682 "uuid": "ad09f5cc-d515-5bd9-8cc0-497f04623c28", 00:16:40.682 "is_configured": true, 00:16:40.682 "data_offset": 2048, 00:16:40.682 "data_size": 63488 00:16:40.683 } 00:16:40.683 ] 00:16:40.683 }' 00:16:40.683 18:53:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.683 18:53:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.249 18:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:41.249 18:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:41.249 [2024-07-24 18:53:26.124130] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18be210 00:16:42.184 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.444 
18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.444 "name": "raid_bdev1", 00:16:42.444 "uuid": "a0ac3aa9-544d-4933-baaa-14a7c2e35b69", 00:16:42.444 "strip_size_kb": 64, 00:16:42.444 "state": "online", 00:16:42.444 "raid_level": "concat", 00:16:42.444 "superblock": true, 00:16:42.444 "num_base_bdevs": 4, 00:16:42.444 "num_base_bdevs_discovered": 4, 00:16:42.444 "num_base_bdevs_operational": 4, 00:16:42.444 "base_bdevs_list": [ 00:16:42.444 { 00:16:42.444 "name": "BaseBdev1", 00:16:42.444 "uuid": "2028f70c-f87f-5cf1-aab6-a97ea1d5120f", 00:16:42.444 "is_configured": true, 00:16:42.444 "data_offset": 2048, 00:16:42.444 "data_size": 63488 00:16:42.444 }, 00:16:42.444 { 00:16:42.444 "name": "BaseBdev2", 00:16:42.444 "uuid": "b8011f7a-3dfb-5741-96dd-3148d919159c", 00:16:42.444 "is_configured": true, 00:16:42.444 "data_offset": 2048, 00:16:42.444 "data_size": 63488 00:16:42.444 }, 00:16:42.444 { 00:16:42.444 "name": "BaseBdev3", 00:16:42.444 "uuid": "3d73d16b-0c31-5ed0-aee5-66965e07a119", 00:16:42.444 "is_configured": true, 00:16:42.444 "data_offset": 2048, 00:16:42.444 "data_size": 63488 00:16:42.444 }, 00:16:42.444 { 00:16:42.444 "name": "BaseBdev4", 00:16:42.444 "uuid": "ad09f5cc-d515-5bd9-8cc0-497f04623c28", 00:16:42.444 "is_configured": true, 00:16:42.444 "data_offset": 2048, 00:16:42.444 "data_size": 63488 00:16:42.444 } 00:16:42.444 ] 00:16:42.444 }' 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.444 18:53:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.011 18:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:43.270 [2024-07-24 18:53:28.049264] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:43.270 [2024-07-24 18:53:28.049296] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:43.270 [2024-07-24 18:53:28.051346] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:43.270 [2024-07-24 18:53:28.051373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:43.270 [2024-07-24 18:53:28.051399] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:43.270 [2024-07-24 18:53:28.051404] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18bad90 name 
raid_bdev1, state offline 00:16:43.270 0 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2132854 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2132854 ']' 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2132854 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2132854 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2132854' 00:16:43.270 killing process with pid 2132854 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2132854 00:16:43.270 [2024-07-24 18:53:28.112826] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:43.270 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2132854 00:16:43.270 [2024-07-24 18:53:28.138999] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vkqrTjATVW 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:43.529 00:16:43.529 real 0m6.018s 00:16:43.529 user 0m9.424s 00:16:43.529 sys 0m0.936s 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:43.529 18:53:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.529 ************************************ 00:16:43.529 END TEST raid_write_error_test 00:16:43.529 ************************************ 00:16:43.529 18:53:28 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:43.529 18:53:28 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:16:43.529 18:53:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:43.529 18:53:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:43.529 18:53:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:43.529 ************************************ 00:16:43.529 START TEST raid_state_function_test 00:16:43.529 ************************************ 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 
false 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2134080 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2134080' 00:16:43.529 Process raid pid: 2134080 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:43.529 18:53:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2134080 /var/tmp/spdk-raid.sock 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2134080 ']' 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:43.529 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:43.529 18:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.529 [2024-07-24 18:53:28.456994] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:16:43.529 [2024-07-24 18:53:28.457031] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:43.529 [2024-07-24 18:53:28.520127] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:43.787 [2024-07-24 18:53:28.593876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:43.787 [2024-07-24 18:53:28.644694] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:43.787 [2024-07-24 18:53:28.644721] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:44.354 18:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:44.354 18:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:44.354 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:44.613 [2024-07-24 18:53:29.395486] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:44.613 [2024-07-24 18:53:29.395515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:44.613 [2024-07-24 18:53:29.395521] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:44.613 [2024-07-24 18:53:29.395526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:44.613 [2024-07-24 18:53:29.395531] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:44.613 [2024-07-24 18:53:29.395536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:44.613 [2024-07-24 18:53:29.395540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:44.613 [2024-07-24 18:53:29.395545] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.613 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.613 "name": "Existed_Raid", 00:16:44.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.613 "strip_size_kb": 0, 00:16:44.613 "state": "configuring", 00:16:44.613 "raid_level": "raid1", 00:16:44.613 "superblock": false, 00:16:44.613 "num_base_bdevs": 4, 00:16:44.613 "num_base_bdevs_discovered": 0, 00:16:44.613 "num_base_bdevs_operational": 4, 00:16:44.613 "base_bdevs_list": [ 00:16:44.613 { 00:16:44.613 "name": "BaseBdev1", 00:16:44.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.613 "is_configured": false, 00:16:44.613 "data_offset": 0, 00:16:44.613 "data_size": 0 00:16:44.613 }, 00:16:44.613 { 00:16:44.613 "name": "BaseBdev2", 00:16:44.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.613 "is_configured": false, 00:16:44.613 "data_offset": 0, 00:16:44.613 "data_size": 0 00:16:44.613 }, 00:16:44.613 { 00:16:44.613 "name": "BaseBdev3", 00:16:44.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.613 "is_configured": false, 00:16:44.614 "data_offset": 0, 00:16:44.614 "data_size": 0 00:16:44.614 }, 00:16:44.614 { 00:16:44.614 "name": "BaseBdev4", 00:16:44.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.614 "is_configured": false, 00:16:44.614 "data_offset": 0, 00:16:44.614 "data_size": 0 00:16:44.614 } 00:16:44.614 ] 00:16:44.614 }' 00:16:44.614 18:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.614 18:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.181 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:45.181 [2024-07-24 18:53:30.181452] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:45.181 [2024-07-24 18:53:30.181485] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacbbc0 name Existed_Raid, state configuring 00:16:45.439 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:45.439 [2024-07-24 18:53:30.349883] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:45.439 [2024-07-24 18:53:30.349900] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:45.439 [2024-07-24 18:53:30.349905] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:45.439 [2024-07-24 18:53:30.349910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:45.439 [2024-07-24 18:53:30.349914] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:45.439 [2024-07-24 18:53:30.349919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:45.439 [2024-07-24 18:53:30.349922] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:45.439 [2024-07-24 18:53:30.349927] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:45.439 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:45.698 [2024-07-24 18:53:30.526545] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.698 BaseBdev1 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:45.698 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.957 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.957 [ 00:16:45.957 { 00:16:45.957 "name": "BaseBdev1", 00:16:45.957 "aliases": [ 00:16:45.957 "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8" 00:16:45.957 ], 00:16:45.957 "product_name": "Malloc disk", 00:16:45.957 "block_size": 512, 00:16:45.957 "num_blocks": 65536, 00:16:45.957 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:45.957 "assigned_rate_limits": { 00:16:45.957 "rw_ios_per_sec": 0, 00:16:45.957 "rw_mbytes_per_sec": 0, 00:16:45.957 "r_mbytes_per_sec": 0, 00:16:45.957 "w_mbytes_per_sec": 0 00:16:45.957 }, 00:16:45.957 "claimed": true, 00:16:45.957 "claim_type": "exclusive_write", 00:16:45.957 "zoned": false, 00:16:45.957 "supported_io_types": { 00:16:45.957 "read": true, 00:16:45.957 "write": true, 00:16:45.957 "unmap": true, 00:16:45.957 "flush": true, 00:16:45.957 "reset": true, 00:16:45.957 "nvme_admin": false, 00:16:45.957 "nvme_io": false, 00:16:45.957 "nvme_io_md": false, 00:16:45.957 "write_zeroes": 
true, 00:16:45.957 "zcopy": true, 00:16:45.957 "get_zone_info": false, 00:16:45.957 "zone_management": false, 00:16:45.957 "zone_append": false, 00:16:45.957 "compare": false, 00:16:45.957 "compare_and_write": false, 00:16:45.957 "abort": true, 00:16:45.957 "seek_hole": false, 00:16:45.957 "seek_data": false, 00:16:45.957 "copy": true, 00:16:45.957 "nvme_iov_md": false 00:16:45.957 }, 00:16:45.957 "memory_domains": [ 00:16:45.957 { 00:16:45.957 "dma_device_id": "system", 00:16:45.957 "dma_device_type": 1 00:16:45.957 }, 00:16:45.957 { 00:16:45.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.957 "dma_device_type": 2 00:16:45.957 } 00:16:45.957 ], 00:16:45.957 "driver_specific": {} 00:16:45.957 } 00:16:45.957 ] 00:16:45.957 18:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.958 18:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.217 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.217 "name": "Existed_Raid", 00:16:46.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.217 "strip_size_kb": 0, 00:16:46.217 "state": "configuring", 00:16:46.217 "raid_level": "raid1", 00:16:46.217 "superblock": false, 00:16:46.217 "num_base_bdevs": 4, 00:16:46.217 "num_base_bdevs_discovered": 1, 00:16:46.217 "num_base_bdevs_operational": 4, 00:16:46.217 "base_bdevs_list": [ 00:16:46.217 { 00:16:46.217 "name": "BaseBdev1", 00:16:46.217 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:46.217 "is_configured": true, 00:16:46.217 "data_offset": 0, 00:16:46.217 "data_size": 65536 00:16:46.217 }, 00:16:46.217 { 00:16:46.217 "name": "BaseBdev2", 00:16:46.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.217 "is_configured": false, 00:16:46.217 "data_offset": 0, 00:16:46.217 "data_size": 0 00:16:46.217 }, 00:16:46.217 { 00:16:46.217 "name": "BaseBdev3", 00:16:46.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.217 "is_configured": false, 00:16:46.217 "data_offset": 0, 00:16:46.217 "data_size": 0 00:16:46.217 }, 00:16:46.217 { 00:16:46.217 "name": "BaseBdev4", 00:16:46.217 
"uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.217 "is_configured": false, 00:16:46.217 "data_offset": 0, 00:16:46.217 "data_size": 0 00:16:46.217 } 00:16:46.217 ] 00:16:46.217 }' 00:16:46.217 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.217 18:53:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.784 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:46.784 [2024-07-24 18:53:31.705592] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:46.784 [2024-07-24 18:53:31.705619] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacb430 name Existed_Raid, state configuring 00:16:46.784 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:47.043 [2024-07-24 18:53:31.874041] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:47.043 [2024-07-24 18:53:31.875041] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:47.043 [2024-07-24 18:53:31.875065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:47.043 [2024-07-24 18:53:31.875070] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:47.043 [2024-07-24 18:53:31.875075] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:47.043 [2024-07-24 18:53:31.875079] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:47.043 [2024-07-24 18:53:31.875084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.043 18:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.301 18:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.301 "name": "Existed_Raid", 00:16:47.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.301 "strip_size_kb": 0, 00:16:47.301 "state": "configuring", 00:16:47.301 "raid_level": "raid1", 00:16:47.301 "superblock": false, 00:16:47.301 "num_base_bdevs": 4, 00:16:47.301 "num_base_bdevs_discovered": 1, 00:16:47.301 "num_base_bdevs_operational": 4, 00:16:47.301 "base_bdevs_list": [ 00:16:47.301 { 00:16:47.301 "name": "BaseBdev1", 00:16:47.301 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:47.301 "is_configured": true, 00:16:47.301 "data_offset": 0, 00:16:47.301 "data_size": 65536 00:16:47.301 }, 00:16:47.301 { 00:16:47.301 "name": "BaseBdev2", 00:16:47.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.301 "is_configured": false, 00:16:47.301 "data_offset": 0, 00:16:47.301 "data_size": 0 00:16:47.301 }, 00:16:47.301 { 00:16:47.301 "name": "BaseBdev3", 00:16:47.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.301 "is_configured": false, 00:16:47.301 "data_offset": 0, 00:16:47.301 "data_size": 0 00:16:47.301 }, 00:16:47.301 { 00:16:47.302 "name": "BaseBdev4", 00:16:47.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.302 "is_configured": false, 00:16:47.302 "data_offset": 0, 00:16:47.302 "data_size": 0 00:16:47.302 } 00:16:47.302 ] 00:16:47.302 }' 00:16:47.302 18:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.302 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.562 18:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:47.820 [2024-07-24 18:53:32.702825] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:47.820 BaseBdev2 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:47.820 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.080 18:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:48.080 [ 00:16:48.080 { 00:16:48.080 "name": "BaseBdev2", 00:16:48.080 "aliases": [ 00:16:48.080 "87d08b35-804e-47da-ab72-3095fbe21b82" 00:16:48.080 ], 00:16:48.080 "product_name": "Malloc disk", 00:16:48.080 
"block_size": 512, 00:16:48.080 "num_blocks": 65536, 00:16:48.080 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:48.080 "assigned_rate_limits": { 00:16:48.080 "rw_ios_per_sec": 0, 00:16:48.080 "rw_mbytes_per_sec": 0, 00:16:48.080 "r_mbytes_per_sec": 0, 00:16:48.080 "w_mbytes_per_sec": 0 00:16:48.080 }, 00:16:48.080 "claimed": true, 00:16:48.080 "claim_type": "exclusive_write", 00:16:48.080 "zoned": false, 00:16:48.080 "supported_io_types": { 00:16:48.080 "read": true, 00:16:48.080 "write": true, 00:16:48.080 "unmap": true, 00:16:48.080 "flush": true, 00:16:48.080 "reset": true, 00:16:48.080 "nvme_admin": false, 00:16:48.080 "nvme_io": false, 00:16:48.080 "nvme_io_md": false, 00:16:48.080 "write_zeroes": true, 00:16:48.080 "zcopy": true, 00:16:48.080 "get_zone_info": false, 00:16:48.080 "zone_management": false, 00:16:48.080 "zone_append": false, 00:16:48.080 "compare": false, 00:16:48.080 "compare_and_write": false, 00:16:48.080 "abort": true, 00:16:48.080 "seek_hole": false, 00:16:48.080 "seek_data": false, 00:16:48.080 "copy": true, 00:16:48.080 "nvme_iov_md": false 00:16:48.080 }, 00:16:48.080 "memory_domains": [ 00:16:48.080 { 00:16:48.080 "dma_device_id": "system", 00:16:48.080 "dma_device_type": 1 00:16:48.080 }, 00:16:48.080 { 00:16:48.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.080 "dma_device_type": 2 00:16:48.080 } 00:16:48.080 ], 00:16:48.080 "driver_specific": {} 00:16:48.080 } 00:16:48.080 ] 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.080 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.081 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.081 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.081 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.081 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.349 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.349 "name": "Existed_Raid", 00:16:48.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.349 "strip_size_kb": 0, 00:16:48.349 "state": "configuring", 00:16:48.349 "raid_level": 
"raid1", 00:16:48.349 "superblock": false, 00:16:48.349 "num_base_bdevs": 4, 00:16:48.349 "num_base_bdevs_discovered": 2, 00:16:48.349 "num_base_bdevs_operational": 4, 00:16:48.349 "base_bdevs_list": [ 00:16:48.349 { 00:16:48.349 "name": "BaseBdev1", 00:16:48.349 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:48.349 "is_configured": true, 00:16:48.349 "data_offset": 0, 00:16:48.349 "data_size": 65536 00:16:48.349 }, 00:16:48.349 { 00:16:48.349 "name": "BaseBdev2", 00:16:48.349 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:48.349 "is_configured": true, 00:16:48.349 "data_offset": 0, 00:16:48.349 "data_size": 65536 00:16:48.349 }, 00:16:48.349 { 00:16:48.349 "name": "BaseBdev3", 00:16:48.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.349 "is_configured": false, 00:16:48.349 "data_offset": 0, 00:16:48.349 "data_size": 0 00:16:48.349 }, 00:16:48.349 { 00:16:48.349 "name": "BaseBdev4", 00:16:48.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.349 "is_configured": false, 00:16:48.349 "data_offset": 0, 00:16:48.349 "data_size": 0 00:16:48.349 } 00:16:48.349 ] 00:16:48.349 }' 00:16:48.349 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.349 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:48.963 [2024-07-24 18:53:33.876489] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.963 BaseBdev3 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:48.963 18:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:49.222 [ 00:16:49.222 { 00:16:49.222 "name": "BaseBdev3", 00:16:49.222 "aliases": [ 00:16:49.222 "02e43a62-63de-4066-9ffd-8e22b8af6e13" 00:16:49.222 ], 00:16:49.222 "product_name": "Malloc disk", 00:16:49.222 "block_size": 512, 00:16:49.222 "num_blocks": 65536, 00:16:49.222 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:49.222 "assigned_rate_limits": { 00:16:49.222 "rw_ios_per_sec": 0, 00:16:49.222 "rw_mbytes_per_sec": 0, 00:16:49.222 "r_mbytes_per_sec": 0, 00:16:49.222 "w_mbytes_per_sec": 0 00:16:49.222 }, 00:16:49.222 "claimed": true, 00:16:49.222 "claim_type": "exclusive_write", 00:16:49.222 "zoned": false, 00:16:49.222 "supported_io_types": { 00:16:49.222 "read": true, 00:16:49.222 "write": true, 00:16:49.222 "unmap": true, 00:16:49.222 
"flush": true, 00:16:49.222 "reset": true, 00:16:49.222 "nvme_admin": false, 00:16:49.222 "nvme_io": false, 00:16:49.222 "nvme_io_md": false, 00:16:49.222 "write_zeroes": true, 00:16:49.222 "zcopy": true, 00:16:49.222 "get_zone_info": false, 00:16:49.222 "zone_management": false, 00:16:49.222 "zone_append": false, 00:16:49.222 "compare": false, 00:16:49.222 "compare_and_write": false, 00:16:49.222 "abort": true, 00:16:49.222 "seek_hole": false, 00:16:49.222 "seek_data": false, 00:16:49.222 "copy": true, 00:16:49.222 "nvme_iov_md": false 00:16:49.222 }, 00:16:49.222 "memory_domains": [ 00:16:49.222 { 00:16:49.222 "dma_device_id": "system", 00:16:49.222 "dma_device_type": 1 00:16:49.222 }, 00:16:49.222 { 00:16:49.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.222 "dma_device_type": 2 00:16:49.222 } 00:16:49.222 ], 00:16:49.222 "driver_specific": {} 00:16:49.222 } 00:16:49.222 ] 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.222 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.480 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.480 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.480 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.480 "name": "Existed_Raid", 00:16:49.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.480 "strip_size_kb": 0, 00:16:49.480 "state": "configuring", 00:16:49.480 "raid_level": "raid1", 00:16:49.480 "superblock": false, 00:16:49.480 "num_base_bdevs": 4, 00:16:49.480 "num_base_bdevs_discovered": 3, 00:16:49.480 "num_base_bdevs_operational": 4, 00:16:49.480 "base_bdevs_list": [ 00:16:49.480 { 00:16:49.480 "name": "BaseBdev1", 00:16:49.480 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:49.480 "is_configured": true, 00:16:49.480 "data_offset": 0, 00:16:49.480 "data_size": 65536 00:16:49.480 }, 00:16:49.480 { 00:16:49.480 "name": "BaseBdev2", 00:16:49.480 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:49.480 
"is_configured": true, 00:16:49.480 "data_offset": 0, 00:16:49.480 "data_size": 65536 00:16:49.481 }, 00:16:49.481 { 00:16:49.481 "name": "BaseBdev3", 00:16:49.481 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:49.481 "is_configured": true, 00:16:49.481 "data_offset": 0, 00:16:49.481 "data_size": 65536 00:16:49.481 }, 00:16:49.481 { 00:16:49.481 "name": "BaseBdev4", 00:16:49.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.481 "is_configured": false, 00:16:49.481 "data_offset": 0, 00:16:49.481 "data_size": 0 00:16:49.481 } 00:16:49.481 ] 00:16:49.481 }' 00:16:49.481 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.481 18:53:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.048 18:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:50.048 [2024-07-24 18:53:35.046272] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:50.048 [2024-07-24 18:53:35.046300] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xacc490 00:16:50.048 [2024-07-24 18:53:35.046304] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:50.048 [2024-07-24 18:53:35.046437] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab82d0 00:16:50.048 [2024-07-24 18:53:35.046537] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xacc490 00:16:50.048 [2024-07-24 18:53:35.046543] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xacc490 00:16:50.048 [2024-07-24 18:53:35.046661] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:50.048 BaseBdev4 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:50.307 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:50.566 [ 00:16:50.566 { 00:16:50.566 "name": "BaseBdev4", 00:16:50.566 "aliases": [ 00:16:50.566 "d1777556-518d-4899-afb3-5a8676d5ec1b" 00:16:50.566 ], 00:16:50.566 "product_name": "Malloc disk", 00:16:50.566 "block_size": 512, 00:16:50.566 "num_blocks": 65536, 00:16:50.566 "uuid": "d1777556-518d-4899-afb3-5a8676d5ec1b", 00:16:50.566 "assigned_rate_limits": { 00:16:50.566 "rw_ios_per_sec": 0, 00:16:50.566 "rw_mbytes_per_sec": 0, 00:16:50.566 "r_mbytes_per_sec": 0, 00:16:50.566 "w_mbytes_per_sec": 0 00:16:50.566 }, 00:16:50.566 "claimed": true, 
00:16:50.566 "claim_type": "exclusive_write", 00:16:50.566 "zoned": false, 00:16:50.566 "supported_io_types": { 00:16:50.566 "read": true, 00:16:50.566 "write": true, 00:16:50.566 "unmap": true, 00:16:50.566 "flush": true, 00:16:50.566 "reset": true, 00:16:50.566 "nvme_admin": false, 00:16:50.566 "nvme_io": false, 00:16:50.566 "nvme_io_md": false, 00:16:50.566 "write_zeroes": true, 00:16:50.566 "zcopy": true, 00:16:50.566 "get_zone_info": false, 00:16:50.566 "zone_management": false, 00:16:50.566 "zone_append": false, 00:16:50.566 "compare": false, 00:16:50.566 "compare_and_write": false, 00:16:50.566 "abort": true, 00:16:50.566 "seek_hole": false, 00:16:50.566 "seek_data": false, 00:16:50.566 "copy": true, 00:16:50.566 "nvme_iov_md": false 00:16:50.566 }, 00:16:50.566 "memory_domains": [ 00:16:50.566 { 00:16:50.566 "dma_device_id": "system", 00:16:50.566 "dma_device_type": 1 00:16:50.566 }, 00:16:50.566 { 00:16:50.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.566 "dma_device_type": 2 00:16:50.566 } 00:16:50.566 ], 00:16:50.566 "driver_specific": {} 00:16:50.566 } 00:16:50.566 ] 00:16:50.566 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.567 "name": "Existed_Raid", 00:16:50.567 "uuid": "a4ebd819-a475-4427-bfa3-e768d1dccda2", 00:16:50.567 "strip_size_kb": 0, 00:16:50.567 "state": "online", 00:16:50.567 "raid_level": "raid1", 00:16:50.567 "superblock": false, 00:16:50.567 "num_base_bdevs": 4, 00:16:50.567 "num_base_bdevs_discovered": 4, 00:16:50.567 "num_base_bdevs_operational": 4, 00:16:50.567 "base_bdevs_list": [ 00:16:50.567 { 00:16:50.567 "name": "BaseBdev1", 00:16:50.567 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:50.567 "is_configured": true, 00:16:50.567 
"data_offset": 0, 00:16:50.567 "data_size": 65536 00:16:50.567 }, 00:16:50.567 { 00:16:50.567 "name": "BaseBdev2", 00:16:50.567 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:50.567 "is_configured": true, 00:16:50.567 "data_offset": 0, 00:16:50.567 "data_size": 65536 00:16:50.567 }, 00:16:50.567 { 00:16:50.567 "name": "BaseBdev3", 00:16:50.567 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:50.567 "is_configured": true, 00:16:50.567 "data_offset": 0, 00:16:50.567 "data_size": 65536 00:16:50.567 }, 00:16:50.567 { 00:16:50.567 "name": "BaseBdev4", 00:16:50.567 "uuid": "d1777556-518d-4899-afb3-5a8676d5ec1b", 00:16:50.567 "is_configured": true, 00:16:50.567 "data_offset": 0, 00:16:50.567 "data_size": 65536 00:16:50.567 } 00:16:50.567 ] 00:16:50.567 }' 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.567 18:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:51.133 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:51.392 [2024-07-24 18:53:36.169439] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:51.392 "name": "Existed_Raid", 00:16:51.392 "aliases": [ 00:16:51.392 "a4ebd819-a475-4427-bfa3-e768d1dccda2" 00:16:51.392 ], 00:16:51.392 "product_name": "Raid Volume", 00:16:51.392 "block_size": 512, 00:16:51.392 "num_blocks": 65536, 00:16:51.392 "uuid": "a4ebd819-a475-4427-bfa3-e768d1dccda2", 00:16:51.392 "assigned_rate_limits": { 00:16:51.392 "rw_ios_per_sec": 0, 00:16:51.392 "rw_mbytes_per_sec": 0, 00:16:51.392 "r_mbytes_per_sec": 0, 00:16:51.392 "w_mbytes_per_sec": 0 00:16:51.392 }, 00:16:51.392 "claimed": false, 00:16:51.392 "zoned": false, 00:16:51.392 "supported_io_types": { 00:16:51.392 "read": true, 00:16:51.392 "write": true, 00:16:51.392 "unmap": false, 00:16:51.392 "flush": false, 00:16:51.392 "reset": true, 00:16:51.392 "nvme_admin": false, 00:16:51.392 "nvme_io": false, 00:16:51.392 "nvme_io_md": false, 00:16:51.392 "write_zeroes": true, 00:16:51.392 "zcopy": false, 00:16:51.392 "get_zone_info": false, 00:16:51.392 "zone_management": false, 00:16:51.392 "zone_append": false, 00:16:51.392 "compare": false, 00:16:51.392 "compare_and_write": false, 00:16:51.392 "abort": false, 00:16:51.392 "seek_hole": false, 00:16:51.392 "seek_data": false, 00:16:51.392 "copy": false, 00:16:51.392 "nvme_iov_md": false 00:16:51.392 }, 00:16:51.392 "memory_domains": [ 00:16:51.392 { 00:16:51.392 "dma_device_id": "system", 00:16:51.392 "dma_device_type": 1 
00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.392 "dma_device_type": 2 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "system", 00:16:51.392 "dma_device_type": 1 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.392 "dma_device_type": 2 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "system", 00:16:51.392 "dma_device_type": 1 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.392 "dma_device_type": 2 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "system", 00:16:51.392 "dma_device_type": 1 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.392 "dma_device_type": 2 00:16:51.392 } 00:16:51.392 ], 00:16:51.392 "driver_specific": { 00:16:51.392 "raid": { 00:16:51.392 "uuid": "a4ebd819-a475-4427-bfa3-e768d1dccda2", 00:16:51.392 "strip_size_kb": 0, 00:16:51.392 "state": "online", 00:16:51.392 "raid_level": "raid1", 00:16:51.392 "superblock": false, 00:16:51.392 "num_base_bdevs": 4, 00:16:51.392 "num_base_bdevs_discovered": 4, 00:16:51.392 "num_base_bdevs_operational": 4, 00:16:51.392 "base_bdevs_list": [ 00:16:51.392 { 00:16:51.392 "name": "BaseBdev1", 00:16:51.392 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:51.392 "is_configured": true, 00:16:51.392 "data_offset": 0, 00:16:51.392 "data_size": 65536 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "name": "BaseBdev2", 00:16:51.392 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:51.392 "is_configured": true, 00:16:51.392 "data_offset": 0, 00:16:51.392 "data_size": 65536 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "name": "BaseBdev3", 00:16:51.392 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:51.392 "is_configured": true, 00:16:51.392 "data_offset": 0, 00:16:51.392 "data_size": 65536 00:16:51.392 }, 00:16:51.392 { 00:16:51.392 "name": "BaseBdev4", 00:16:51.392 "uuid": "d1777556-518d-4899-afb3-5a8676d5ec1b", 00:16:51.392 "is_configured": true, 00:16:51.392 "data_offset": 0, 00:16:51.392 "data_size": 65536 00:16:51.392 } 00:16:51.392 ] 00:16:51.392 } 00:16:51.392 } 00:16:51.392 }' 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:51.392 BaseBdev2 00:16:51.392 BaseBdev3 00:16:51.392 BaseBdev4' 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:51.392 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.651 "name": "BaseBdev1", 00:16:51.651 "aliases": [ 00:16:51.651 "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8" 00:16:51.651 ], 00:16:51.651 "product_name": "Malloc disk", 00:16:51.651 "block_size": 512, 00:16:51.651 "num_blocks": 65536, 00:16:51.651 "uuid": "db69729a-0f77-4aa6-8b8a-37bb4f9c75c8", 00:16:51.651 "assigned_rate_limits": { 00:16:51.651 "rw_ios_per_sec": 0, 00:16:51.651 "rw_mbytes_per_sec": 0, 00:16:51.651 "r_mbytes_per_sec": 0, 00:16:51.651 "w_mbytes_per_sec": 
0 00:16:51.651 }, 00:16:51.651 "claimed": true, 00:16:51.651 "claim_type": "exclusive_write", 00:16:51.651 "zoned": false, 00:16:51.651 "supported_io_types": { 00:16:51.651 "read": true, 00:16:51.651 "write": true, 00:16:51.651 "unmap": true, 00:16:51.651 "flush": true, 00:16:51.651 "reset": true, 00:16:51.651 "nvme_admin": false, 00:16:51.651 "nvme_io": false, 00:16:51.651 "nvme_io_md": false, 00:16:51.651 "write_zeroes": true, 00:16:51.651 "zcopy": true, 00:16:51.651 "get_zone_info": false, 00:16:51.651 "zone_management": false, 00:16:51.651 "zone_append": false, 00:16:51.651 "compare": false, 00:16:51.651 "compare_and_write": false, 00:16:51.651 "abort": true, 00:16:51.651 "seek_hole": false, 00:16:51.651 "seek_data": false, 00:16:51.651 "copy": true, 00:16:51.651 "nvme_iov_md": false 00:16:51.651 }, 00:16:51.651 "memory_domains": [ 00:16:51.651 { 00:16:51.651 "dma_device_id": "system", 00:16:51.651 "dma_device_type": 1 00:16:51.651 }, 00:16:51.651 { 00:16:51.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.651 "dma_device_type": 2 00:16:51.651 } 00:16:51.651 ], 00:16:51.651 "driver_specific": {} 00:16:51.651 }' 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.651 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.909 "name": "BaseBdev2", 00:16:51.909 "aliases": [ 00:16:51.909 "87d08b35-804e-47da-ab72-3095fbe21b82" 00:16:51.909 ], 00:16:51.909 "product_name": "Malloc disk", 00:16:51.909 "block_size": 512, 00:16:51.909 "num_blocks": 65536, 00:16:51.909 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:51.909 "assigned_rate_limits": { 00:16:51.909 "rw_ios_per_sec": 0, 00:16:51.909 "rw_mbytes_per_sec": 0, 00:16:51.909 "r_mbytes_per_sec": 0, 00:16:51.909 "w_mbytes_per_sec": 0 00:16:51.909 }, 00:16:51.909 "claimed": true, 00:16:51.909 "claim_type": "exclusive_write", 00:16:51.909 "zoned": false, 00:16:51.909 "supported_io_types": { 00:16:51.909 "read": 
true, 00:16:51.909 "write": true, 00:16:51.909 "unmap": true, 00:16:51.909 "flush": true, 00:16:51.909 "reset": true, 00:16:51.909 "nvme_admin": false, 00:16:51.909 "nvme_io": false, 00:16:51.909 "nvme_io_md": false, 00:16:51.909 "write_zeroes": true, 00:16:51.909 "zcopy": true, 00:16:51.909 "get_zone_info": false, 00:16:51.909 "zone_management": false, 00:16:51.909 "zone_append": false, 00:16:51.909 "compare": false, 00:16:51.909 "compare_and_write": false, 00:16:51.909 "abort": true, 00:16:51.909 "seek_hole": false, 00:16:51.909 "seek_data": false, 00:16:51.909 "copy": true, 00:16:51.909 "nvme_iov_md": false 00:16:51.909 }, 00:16:51.909 "memory_domains": [ 00:16:51.909 { 00:16:51.909 "dma_device_id": "system", 00:16:51.909 "dma_device_type": 1 00:16:51.909 }, 00:16:51.909 { 00:16:51.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.909 "dma_device_type": 2 00:16:51.909 } 00:16:51.909 ], 00:16:51.909 "driver_specific": {} 00:16:51.909 }' 00:16:51.909 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.168 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.168 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.168 18:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.168 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.427 "name": "BaseBdev3", 00:16:52.427 "aliases": [ 00:16:52.427 "02e43a62-63de-4066-9ffd-8e22b8af6e13" 00:16:52.427 ], 00:16:52.427 "product_name": "Malloc disk", 00:16:52.427 "block_size": 512, 00:16:52.427 "num_blocks": 65536, 00:16:52.427 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:52.427 "assigned_rate_limits": { 00:16:52.427 "rw_ios_per_sec": 0, 00:16:52.427 "rw_mbytes_per_sec": 0, 00:16:52.427 "r_mbytes_per_sec": 0, 00:16:52.427 "w_mbytes_per_sec": 0 00:16:52.427 }, 00:16:52.427 "claimed": true, 00:16:52.427 "claim_type": "exclusive_write", 00:16:52.427 "zoned": false, 00:16:52.427 "supported_io_types": { 00:16:52.427 "read": true, 00:16:52.427 "write": true, 00:16:52.427 "unmap": true, 00:16:52.427 "flush": true, 00:16:52.427 "reset": true, 00:16:52.427 "nvme_admin": false, 00:16:52.427 "nvme_io": 
false, 00:16:52.427 "nvme_io_md": false, 00:16:52.427 "write_zeroes": true, 00:16:52.427 "zcopy": true, 00:16:52.427 "get_zone_info": false, 00:16:52.427 "zone_management": false, 00:16:52.427 "zone_append": false, 00:16:52.427 "compare": false, 00:16:52.427 "compare_and_write": false, 00:16:52.427 "abort": true, 00:16:52.427 "seek_hole": false, 00:16:52.427 "seek_data": false, 00:16:52.427 "copy": true, 00:16:52.427 "nvme_iov_md": false 00:16:52.427 }, 00:16:52.427 "memory_domains": [ 00:16:52.427 { 00:16:52.427 "dma_device_id": "system", 00:16:52.427 "dma_device_type": 1 00:16:52.427 }, 00:16:52.427 { 00:16:52.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.427 "dma_device_type": 2 00:16:52.427 } 00:16:52.427 ], 00:16:52.427 "driver_specific": {} 00:16:52.427 }' 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.427 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:52.685 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.943 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.943 "name": "BaseBdev4", 00:16:52.943 "aliases": [ 00:16:52.943 "d1777556-518d-4899-afb3-5a8676d5ec1b" 00:16:52.943 ], 00:16:52.943 "product_name": "Malloc disk", 00:16:52.943 "block_size": 512, 00:16:52.943 "num_blocks": 65536, 00:16:52.943 "uuid": "d1777556-518d-4899-afb3-5a8676d5ec1b", 00:16:52.943 "assigned_rate_limits": { 00:16:52.943 "rw_ios_per_sec": 0, 00:16:52.943 "rw_mbytes_per_sec": 0, 00:16:52.943 "r_mbytes_per_sec": 0, 00:16:52.943 "w_mbytes_per_sec": 0 00:16:52.943 }, 00:16:52.943 "claimed": true, 00:16:52.943 "claim_type": "exclusive_write", 00:16:52.943 "zoned": false, 00:16:52.943 "supported_io_types": { 00:16:52.943 "read": true, 00:16:52.943 "write": true, 00:16:52.943 "unmap": true, 00:16:52.943 "flush": true, 00:16:52.943 "reset": true, 00:16:52.943 "nvme_admin": false, 00:16:52.943 "nvme_io": false, 00:16:52.943 "nvme_io_md": false, 00:16:52.943 "write_zeroes": true, 00:16:52.943 "zcopy": true, 00:16:52.943 "get_zone_info": false, 00:16:52.943 "zone_management": false, 
00:16:52.943 "zone_append": false, 00:16:52.943 "compare": false, 00:16:52.943 "compare_and_write": false, 00:16:52.943 "abort": true, 00:16:52.943 "seek_hole": false, 00:16:52.944 "seek_data": false, 00:16:52.944 "copy": true, 00:16:52.944 "nvme_iov_md": false 00:16:52.944 }, 00:16:52.944 "memory_domains": [ 00:16:52.944 { 00:16:52.944 "dma_device_id": "system", 00:16:52.944 "dma_device_type": 1 00:16:52.944 }, 00:16:52.944 { 00:16:52.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.944 "dma_device_type": 2 00:16:52.944 } 00:16:52.944 ], 00:16:52.944 "driver_specific": {} 00:16:52.944 }' 00:16:52.944 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.944 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.944 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.944 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.944 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.202 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.202 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.202 18:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.202 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.202 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.202 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.202 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.203 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:53.461 [2024-07-24 18:53:38.230651] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.461 "name": "Existed_Raid", 00:16:53.461 "uuid": "a4ebd819-a475-4427-bfa3-e768d1dccda2", 00:16:53.461 "strip_size_kb": 0, 00:16:53.461 "state": "online", 00:16:53.461 "raid_level": "raid1", 00:16:53.461 "superblock": false, 00:16:53.461 "num_base_bdevs": 4, 00:16:53.461 "num_base_bdevs_discovered": 3, 00:16:53.461 "num_base_bdevs_operational": 3, 00:16:53.461 "base_bdevs_list": [ 00:16:53.461 { 00:16:53.461 "name": null, 00:16:53.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.461 "is_configured": false, 00:16:53.461 "data_offset": 0, 00:16:53.461 "data_size": 65536 00:16:53.461 }, 00:16:53.461 { 00:16:53.461 "name": "BaseBdev2", 00:16:53.461 "uuid": "87d08b35-804e-47da-ab72-3095fbe21b82", 00:16:53.461 "is_configured": true, 00:16:53.461 "data_offset": 0, 00:16:53.461 "data_size": 65536 00:16:53.461 }, 00:16:53.461 { 00:16:53.461 "name": "BaseBdev3", 00:16:53.461 "uuid": "02e43a62-63de-4066-9ffd-8e22b8af6e13", 00:16:53.461 "is_configured": true, 00:16:53.461 "data_offset": 0, 00:16:53.461 "data_size": 65536 00:16:53.461 }, 00:16:53.461 { 00:16:53.461 "name": "BaseBdev4", 00:16:53.461 "uuid": "d1777556-518d-4899-afb3-5a8676d5ec1b", 00:16:53.461 "is_configured": true, 00:16:53.461 "data_offset": 0, 00:16:53.461 "data_size": 65536 00:16:53.461 } 00:16:53.461 ] 00:16:53.461 }' 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.461 18:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.027 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:54.027 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.027 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.027 18:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:54.285 [2024-07-24 18:53:39.218066] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.285 18:53:39 
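verify_raid_bdev_state (bdev_raid.sh@116-128) is the assertion helper used throughout this run: it takes the expected name, state, raid level, strip size and operational member count, fetches the array descriptor once, and then runs its comparisons after xtrace_disable, which is why only the fetch shows up in the trace. The visible part of the call above amounts to:

  raid_bdev_info=$(rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
    jq -r '.[] | select(.name == "Existed_Raid")')
  # expected at this point: "state": "online", "raid_level": "raid1",
  # num_base_bdevs_discovered == 3, num_base_bdevs_operational == 3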
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.285 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:54.543 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.543 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.543 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:54.543 [2024-07-24 18:53:39.552798] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.801 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:55.060 [2024-07-24 18:53:39.899470] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:55.060 [2024-07-24 18:53:39.899524] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.060 [2024-07-24 18:53:39.909536] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.060 [2024-07-24 18:53:39.909560] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.060 [2024-07-24 18:53:39.909565] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacc490 name Existed_Raid, state offline 00:16:55.060 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:55.060 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:55.060 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.060 18:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:55.318 18:53:40 
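What the block above exercises: after BaseBdev1's backing malloc disk was deleted, the loop at bdev_raid.sh@285-291 walks the remaining members, confirming after each deletion that the raid1 array is still reported, and only once the last member (BaseBdev4) is gone does the array drop from online to offline and get destructed (the raid_bdev_deconfigure and _raid_bdev_destruct DEBUG lines). A hedged reconstruction of that loop; the bail-out and the computed member name are assumptions, since the real script text is not in the log:

  rpc="rpc.py -s /var/tmp/spdk-raid.sock"
  num_base_bdevs=4                            # four-member raid1 in this test
  $rpc bdev_malloc_delete BaseBdev1           # raid1 tolerates the loss, state stays online
  for ((i = 1; i < num_base_bdevs; i++)); do
    raid_bdev=$($rpc bdev_raid_get_bdevs all | jq -r '.[0]["name"]')
    [ "$raid_bdev" != Existed_Raid ] && exit 1   # assumption: fail if the array vanished early
    $rpc bdev_malloc_delete "BaseBdev$((i + 1))"
  done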
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:55.318 BaseBdev2 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.318 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.577 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:55.577 [ 00:16:55.577 { 00:16:55.577 "name": "BaseBdev2", 00:16:55.577 "aliases": [ 00:16:55.577 "ad920af4-910c-4133-a086-ae899b7efbc0" 00:16:55.577 ], 00:16:55.577 "product_name": "Malloc disk", 00:16:55.577 "block_size": 512, 00:16:55.577 "num_blocks": 65536, 00:16:55.577 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:16:55.577 "assigned_rate_limits": { 00:16:55.577 "rw_ios_per_sec": 0, 00:16:55.577 "rw_mbytes_per_sec": 0, 00:16:55.577 "r_mbytes_per_sec": 0, 00:16:55.577 "w_mbytes_per_sec": 0 00:16:55.577 }, 00:16:55.577 "claimed": false, 00:16:55.577 "zoned": false, 00:16:55.577 "supported_io_types": { 00:16:55.577 "read": true, 00:16:55.577 "write": true, 00:16:55.577 "unmap": true, 00:16:55.577 "flush": true, 00:16:55.577 "reset": true, 00:16:55.577 "nvme_admin": false, 00:16:55.577 "nvme_io": false, 00:16:55.577 "nvme_io_md": false, 00:16:55.577 "write_zeroes": true, 00:16:55.577 "zcopy": true, 00:16:55.577 "get_zone_info": false, 00:16:55.577 "zone_management": false, 00:16:55.577 "zone_append": false, 00:16:55.577 "compare": false, 00:16:55.577 "compare_and_write": false, 00:16:55.577 "abort": true, 00:16:55.577 "seek_hole": false, 00:16:55.577 "seek_data": false, 00:16:55.577 "copy": true, 00:16:55.577 "nvme_iov_md": false 00:16:55.577 }, 00:16:55.577 "memory_domains": [ 00:16:55.577 { 00:16:55.577 "dma_device_id": "system", 00:16:55.577 "dma_device_type": 1 00:16:55.577 }, 00:16:55.577 { 00:16:55.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.577 "dma_device_type": 2 00:16:55.577 } 00:16:55.577 ], 00:16:55.577 "driver_specific": {} 00:16:55.577 } 00:16:55.577 ] 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:55.836 BaseBdev3 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.836 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:56.095 18:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:56.095 [ 00:16:56.095 { 00:16:56.095 "name": "BaseBdev3", 00:16:56.095 "aliases": [ 00:16:56.095 "2206938c-eef7-4802-91e8-7c0ac6134e3a" 00:16:56.095 ], 00:16:56.095 "product_name": "Malloc disk", 00:16:56.095 "block_size": 512, 00:16:56.095 "num_blocks": 65536, 00:16:56.095 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:16:56.095 "assigned_rate_limits": { 00:16:56.095 "rw_ios_per_sec": 0, 00:16:56.095 "rw_mbytes_per_sec": 0, 00:16:56.095 "r_mbytes_per_sec": 0, 00:16:56.095 "w_mbytes_per_sec": 0 00:16:56.095 }, 00:16:56.095 "claimed": false, 00:16:56.095 "zoned": false, 00:16:56.095 "supported_io_types": { 00:16:56.095 "read": true, 00:16:56.095 "write": true, 00:16:56.095 "unmap": true, 00:16:56.095 "flush": true, 00:16:56.095 "reset": true, 00:16:56.095 "nvme_admin": false, 00:16:56.095 "nvme_io": false, 00:16:56.095 "nvme_io_md": false, 00:16:56.095 "write_zeroes": true, 00:16:56.095 "zcopy": true, 00:16:56.095 "get_zone_info": false, 00:16:56.095 "zone_management": false, 00:16:56.095 "zone_append": false, 00:16:56.095 "compare": false, 00:16:56.095 "compare_and_write": false, 00:16:56.095 "abort": true, 00:16:56.095 "seek_hole": false, 00:16:56.095 "seek_data": false, 00:16:56.095 "copy": true, 00:16:56.095 "nvme_iov_md": false 00:16:56.095 }, 00:16:56.095 "memory_domains": [ 00:16:56.095 { 00:16:56.095 "dma_device_id": "system", 00:16:56.095 "dma_device_type": 1 00:16:56.095 }, 00:16:56.095 { 00:16:56.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.095 "dma_device_type": 2 00:16:56.095 } 00:16:56.095 ], 00:16:56.095 "driver_specific": {} 00:16:56.095 } 00:16:56.095 ] 00:16:56.095 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:56.095 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.095 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.095 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:56.354 BaseBdev4 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
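With the old array torn down, the test recreates three of the four members. Each bdev_malloc_create 32 512 -b BaseBdevN builds a 512-byte-block, 65536-block malloc disk (matching the descriptors above), and waitforbdev (autotest_common.sh@897-905) then waits for it: it issues bdev_wait_for_examine and queries the bdev with a 2000 ms timeout. A minimal sketch of that helper as exercised here; the retry and error handling of the real helper, and its socket selection, are omitted:

  waitforbdev() {
    local bdev_name=$1 bdev_timeout=${2:-2000}
    rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"
  }
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 && waitforbdev BaseBdev2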
00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:56.354 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:56.613 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:56.613 [ 00:16:56.613 { 00:16:56.613 "name": "BaseBdev4", 00:16:56.613 "aliases": [ 00:16:56.613 "aac84985-1871-4457-980a-72423f9e5080" 00:16:56.613 ], 00:16:56.613 "product_name": "Malloc disk", 00:16:56.613 "block_size": 512, 00:16:56.613 "num_blocks": 65536, 00:16:56.613 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:16:56.613 "assigned_rate_limits": { 00:16:56.613 "rw_ios_per_sec": 0, 00:16:56.613 "rw_mbytes_per_sec": 0, 00:16:56.613 "r_mbytes_per_sec": 0, 00:16:56.613 "w_mbytes_per_sec": 0 00:16:56.613 }, 00:16:56.613 "claimed": false, 00:16:56.613 "zoned": false, 00:16:56.613 "supported_io_types": { 00:16:56.613 "read": true, 00:16:56.613 "write": true, 00:16:56.613 "unmap": true, 00:16:56.613 "flush": true, 00:16:56.613 "reset": true, 00:16:56.613 "nvme_admin": false, 00:16:56.613 "nvme_io": false, 00:16:56.613 "nvme_io_md": false, 00:16:56.613 "write_zeroes": true, 00:16:56.613 "zcopy": true, 00:16:56.613 "get_zone_info": false, 00:16:56.613 "zone_management": false, 00:16:56.613 "zone_append": false, 00:16:56.613 "compare": false, 00:16:56.613 "compare_and_write": false, 00:16:56.613 "abort": true, 00:16:56.613 "seek_hole": false, 00:16:56.613 "seek_data": false, 00:16:56.613 "copy": true, 00:16:56.613 "nvme_iov_md": false 00:16:56.613 }, 00:16:56.613 "memory_domains": [ 00:16:56.613 { 00:16:56.613 "dma_device_id": "system", 00:16:56.613 "dma_device_type": 1 00:16:56.613 }, 00:16:56.613 { 00:16:56.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.613 "dma_device_type": 2 00:16:56.613 } 00:16:56.613 ], 00:16:56.613 "driver_specific": {} 00:16:56.613 } 00:16:56.613 ] 00:16:56.613 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:56.613 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.613 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.613 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:56.872 [2024-07-24 18:53:41.729298] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:56.872 [2024-07-24 18:53:41.729330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:56.872 [2024-07-24 18:53:41.729341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:56.872 [2024-07-24 18:53:41.730309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:56.872 [2024-07-24 18:53:41.730340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.872 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.130 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.130 "name": "Existed_Raid", 00:16:57.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.130 "strip_size_kb": 0, 00:16:57.130 "state": "configuring", 00:16:57.130 "raid_level": "raid1", 00:16:57.130 "superblock": false, 00:16:57.130 "num_base_bdevs": 4, 00:16:57.130 "num_base_bdevs_discovered": 3, 00:16:57.130 "num_base_bdevs_operational": 4, 00:16:57.130 "base_bdevs_list": [ 00:16:57.130 { 00:16:57.130 "name": "BaseBdev1", 00:16:57.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.130 "is_configured": false, 00:16:57.130 "data_offset": 0, 00:16:57.130 "data_size": 0 00:16:57.130 }, 00:16:57.130 { 00:16:57.130 "name": "BaseBdev2", 00:16:57.130 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:16:57.130 "is_configured": true, 00:16:57.130 "data_offset": 0, 00:16:57.130 "data_size": 65536 00:16:57.130 }, 00:16:57.130 { 00:16:57.130 "name": "BaseBdev3", 00:16:57.130 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:16:57.130 "is_configured": true, 00:16:57.130 "data_offset": 0, 00:16:57.130 "data_size": 65536 00:16:57.130 }, 00:16:57.130 { 00:16:57.130 "name": "BaseBdev4", 00:16:57.130 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:16:57.130 "is_configured": true, 00:16:57.130 "data_offset": 0, 00:16:57.130 "data_size": 65536 00:16:57.130 } 00:16:57.130 ] 00:16:57.130 }' 00:16:57.130 18:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.130 18:53:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.388 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:57.646 [2024-07-24 18:53:42.539384] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- 
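bdev_raid_create was asked for a raid1 over four named bdevs while BaseBdev1 does not exist yet, so the module claims the three present members and leaves the array in the configuring state with 3 of 4 members discovered, which is exactly what the verify step above checks. The command pair as it appears in the trace:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
    jq -r '.[] | select(.name == "Existed_Raid")'
  # expected: "state": "configuring", num_base_bdevs_discovered == 3,
  #           num_base_bdevs_operational == 4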
# verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.646 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.905 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.905 "name": "Existed_Raid", 00:16:57.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.905 "strip_size_kb": 0, 00:16:57.905 "state": "configuring", 00:16:57.905 "raid_level": "raid1", 00:16:57.905 "superblock": false, 00:16:57.905 "num_base_bdevs": 4, 00:16:57.905 "num_base_bdevs_discovered": 2, 00:16:57.905 "num_base_bdevs_operational": 4, 00:16:57.905 "base_bdevs_list": [ 00:16:57.905 { 00:16:57.905 "name": "BaseBdev1", 00:16:57.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.905 "is_configured": false, 00:16:57.905 "data_offset": 0, 00:16:57.905 "data_size": 0 00:16:57.905 }, 00:16:57.905 { 00:16:57.905 "name": null, 00:16:57.905 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:16:57.905 "is_configured": false, 00:16:57.905 "data_offset": 0, 00:16:57.905 "data_size": 65536 00:16:57.905 }, 00:16:57.905 { 00:16:57.905 "name": "BaseBdev3", 00:16:57.905 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:16:57.905 "is_configured": true, 00:16:57.905 "data_offset": 0, 00:16:57.905 "data_size": 65536 00:16:57.905 }, 00:16:57.905 { 00:16:57.905 "name": "BaseBdev4", 00:16:57.905 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:16:57.905 "is_configured": true, 00:16:57.905 "data_offset": 0, 00:16:57.905 "data_size": 65536 00:16:57.905 } 00:16:57.905 ] 00:16:57.905 }' 00:16:57.905 18:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.905 18:53:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.482 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:58.482 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.482 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:58.482 18:53:43 bdev_raid.raid_state_function_test -- 
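Removing BaseBdev2 from the still-configuring array does not delete its slot: the descriptor keeps a placeholder entry with the name cleared but the member uuid retained, and the test probes that slot's is_configured flag directly. The pattern from the trace:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
  [[ $(rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
        jq '.[0].base_bdevs_list[1].is_configured') == false ]]   # @310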
bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:58.744 [2024-07-24 18:53:43.548713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:58.744 BaseBdev1 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:58.744 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:59.003 [ 00:16:59.003 { 00:16:59.003 "name": "BaseBdev1", 00:16:59.003 "aliases": [ 00:16:59.003 "52000d7d-3055-45cc-81c3-b481ff28ca4d" 00:16:59.003 ], 00:16:59.003 "product_name": "Malloc disk", 00:16:59.003 "block_size": 512, 00:16:59.003 "num_blocks": 65536, 00:16:59.003 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:16:59.003 "assigned_rate_limits": { 00:16:59.003 "rw_ios_per_sec": 0, 00:16:59.003 "rw_mbytes_per_sec": 0, 00:16:59.003 "r_mbytes_per_sec": 0, 00:16:59.003 "w_mbytes_per_sec": 0 00:16:59.003 }, 00:16:59.003 "claimed": true, 00:16:59.003 "claim_type": "exclusive_write", 00:16:59.003 "zoned": false, 00:16:59.003 "supported_io_types": { 00:16:59.003 "read": true, 00:16:59.003 "write": true, 00:16:59.003 "unmap": true, 00:16:59.003 "flush": true, 00:16:59.003 "reset": true, 00:16:59.003 "nvme_admin": false, 00:16:59.003 "nvme_io": false, 00:16:59.003 "nvme_io_md": false, 00:16:59.003 "write_zeroes": true, 00:16:59.003 "zcopy": true, 00:16:59.003 "get_zone_info": false, 00:16:59.003 "zone_management": false, 00:16:59.003 "zone_append": false, 00:16:59.003 "compare": false, 00:16:59.003 "compare_and_write": false, 00:16:59.003 "abort": true, 00:16:59.003 "seek_hole": false, 00:16:59.003 "seek_data": false, 00:16:59.003 "copy": true, 00:16:59.003 "nvme_iov_md": false 00:16:59.003 }, 00:16:59.003 "memory_domains": [ 00:16:59.003 { 00:16:59.003 "dma_device_id": "system", 00:16:59.003 "dma_device_type": 1 00:16:59.003 }, 00:16:59.003 { 00:16:59.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.003 "dma_device_type": 2 00:16:59.003 } 00:16:59.003 ], 00:16:59.003 "driver_specific": {} 00:16:59.003 } 00:16:59.003 ] 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.003 18:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.261 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.261 "name": "Existed_Raid", 00:16:59.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.261 "strip_size_kb": 0, 00:16:59.261 "state": "configuring", 00:16:59.261 "raid_level": "raid1", 00:16:59.261 "superblock": false, 00:16:59.261 "num_base_bdevs": 4, 00:16:59.261 "num_base_bdevs_discovered": 3, 00:16:59.261 "num_base_bdevs_operational": 4, 00:16:59.261 "base_bdevs_list": [ 00:16:59.261 { 00:16:59.261 "name": "BaseBdev1", 00:16:59.261 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:16:59.261 "is_configured": true, 00:16:59.261 "data_offset": 0, 00:16:59.261 "data_size": 65536 00:16:59.261 }, 00:16:59.261 { 00:16:59.261 "name": null, 00:16:59.261 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:16:59.261 "is_configured": false, 00:16:59.261 "data_offset": 0, 00:16:59.261 "data_size": 65536 00:16:59.261 }, 00:16:59.261 { 00:16:59.261 "name": "BaseBdev3", 00:16:59.261 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:16:59.261 "is_configured": true, 00:16:59.261 "data_offset": 0, 00:16:59.262 "data_size": 65536 00:16:59.262 }, 00:16:59.262 { 00:16:59.262 "name": "BaseBdev4", 00:16:59.262 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:16:59.262 "is_configured": true, 00:16:59.262 "data_offset": 0, 00:16:59.262 "data_size": 65536 00:16:59.262 } 00:16:59.262 ] 00:16:59.262 }' 00:16:59.262 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.262 18:53:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.828 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.828 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:59.828 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:59.828 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:00.087 [2024-07-24 18:53:44.840090] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.087 18:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.087 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.087 "name": "Existed_Raid", 00:17:00.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.087 "strip_size_kb": 0, 00:17:00.087 "state": "configuring", 00:17:00.087 "raid_level": "raid1", 00:17:00.087 "superblock": false, 00:17:00.087 "num_base_bdevs": 4, 00:17:00.087 "num_base_bdevs_discovered": 2, 00:17:00.087 "num_base_bdevs_operational": 4, 00:17:00.087 "base_bdevs_list": [ 00:17:00.087 { 00:17:00.087 "name": "BaseBdev1", 00:17:00.087 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:00.087 "is_configured": true, 00:17:00.087 "data_offset": 0, 00:17:00.087 "data_size": 65536 00:17:00.087 }, 00:17:00.087 { 00:17:00.087 "name": null, 00:17:00.087 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:00.087 "is_configured": false, 00:17:00.087 "data_offset": 0, 00:17:00.087 "data_size": 65536 00:17:00.087 }, 00:17:00.087 { 00:17:00.087 "name": null, 00:17:00.087 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:00.087 "is_configured": false, 00:17:00.087 "data_offset": 0, 00:17:00.087 "data_size": 65536 00:17:00.087 }, 00:17:00.087 { 00:17:00.087 "name": "BaseBdev4", 00:17:00.087 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:00.087 "is_configured": true, 00:17:00.087 "data_offset": 0, 00:17:00.087 "data_size": 65536 00:17:00.087 } 00:17:00.087 ] 00:17:00.087 }' 00:17:00.087 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.087 18:53:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.654 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.654 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:00.912 [2024-07-24 18:53:45.818624] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.912 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.913 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.913 18:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.171 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.171 "name": "Existed_Raid", 00:17:01.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.171 "strip_size_kb": 0, 00:17:01.171 "state": "configuring", 00:17:01.171 "raid_level": "raid1", 00:17:01.171 "superblock": false, 00:17:01.171 "num_base_bdevs": 4, 00:17:01.171 "num_base_bdevs_discovered": 3, 00:17:01.171 "num_base_bdevs_operational": 4, 00:17:01.171 "base_bdevs_list": [ 00:17:01.171 { 00:17:01.171 "name": "BaseBdev1", 00:17:01.171 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:01.171 "is_configured": true, 00:17:01.171 "data_offset": 0, 00:17:01.171 "data_size": 65536 00:17:01.171 }, 00:17:01.171 { 00:17:01.171 "name": null, 00:17:01.171 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:01.171 "is_configured": false, 00:17:01.171 "data_offset": 0, 00:17:01.171 "data_size": 65536 00:17:01.171 }, 00:17:01.171 { 00:17:01.171 "name": "BaseBdev3", 00:17:01.171 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:01.171 "is_configured": true, 00:17:01.171 "data_offset": 0, 00:17:01.171 "data_size": 65536 00:17:01.171 }, 00:17:01.171 { 00:17:01.171 "name": "BaseBdev4", 00:17:01.171 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:01.171 "is_configured": true, 00:17:01.171 "data_offset": 0, 00:17:01.171 "data_size": 65536 00:17:01.171 } 00:17:01.171 ] 00:17:01.171 }' 00:17:01.171 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.171 18:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.737 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
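The counterpart operation: bdev_raid_add_base_bdev hands an existing bdev back to a named array, the raid module claims it (the raid_bdev_configure_base_bdev DEBUG line), and the slot's is_configured flag flips back to true while the array stays in configuring until all four members are present. As run above, with the jq probe whose result the trace completes just below:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  [[ $(rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
        jq '.[0].base_bdevs_list[2].is_configured') == true ]]    # @323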
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.737 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:01.737 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:01.737 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:01.996 [2024-07-24 18:53:46.853305] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.996 18:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.254 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.254 "name": "Existed_Raid", 00:17:02.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.254 "strip_size_kb": 0, 00:17:02.254 "state": "configuring", 00:17:02.254 "raid_level": "raid1", 00:17:02.254 "superblock": false, 00:17:02.254 "num_base_bdevs": 4, 00:17:02.254 "num_base_bdevs_discovered": 2, 00:17:02.254 "num_base_bdevs_operational": 4, 00:17:02.254 "base_bdevs_list": [ 00:17:02.254 { 00:17:02.254 "name": null, 00:17:02.255 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:02.255 "is_configured": false, 00:17:02.255 "data_offset": 0, 00:17:02.255 "data_size": 65536 00:17:02.255 }, 00:17:02.255 { 00:17:02.255 "name": null, 00:17:02.255 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:02.255 "is_configured": false, 00:17:02.255 "data_offset": 0, 00:17:02.255 "data_size": 65536 00:17:02.255 }, 00:17:02.255 { 00:17:02.255 "name": "BaseBdev3", 00:17:02.255 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:02.255 "is_configured": true, 00:17:02.255 "data_offset": 0, 00:17:02.255 "data_size": 65536 00:17:02.255 }, 00:17:02.255 { 00:17:02.255 "name": "BaseBdev4", 00:17:02.255 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:02.255 "is_configured": true, 00:17:02.255 "data_offset": 0, 00:17:02.255 "data_size": 65536 00:17:02.255 } 
00:17:02.255 ] 00:17:02.255 }' 00:17:02.255 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.255 18:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.513 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:02.513 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.772 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:02.772 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:03.031 [2024-07-24 18:53:47.833791] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.031 18:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.031 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.031 "name": "Existed_Raid", 00:17:03.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.031 "strip_size_kb": 0, 00:17:03.031 "state": "configuring", 00:17:03.031 "raid_level": "raid1", 00:17:03.031 "superblock": false, 00:17:03.031 "num_base_bdevs": 4, 00:17:03.031 "num_base_bdevs_discovered": 3, 00:17:03.031 "num_base_bdevs_operational": 4, 00:17:03.031 "base_bdevs_list": [ 00:17:03.031 { 00:17:03.031 "name": null, 00:17:03.031 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:03.031 "is_configured": false, 00:17:03.031 "data_offset": 0, 00:17:03.031 "data_size": 65536 00:17:03.031 }, 00:17:03.031 { 00:17:03.031 "name": "BaseBdev2", 00:17:03.031 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:03.031 "is_configured": true, 00:17:03.031 "data_offset": 0, 00:17:03.031 "data_size": 65536 00:17:03.031 }, 00:17:03.031 { 00:17:03.031 "name": "BaseBdev3", 00:17:03.031 "uuid": 
"2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:03.031 "is_configured": true, 00:17:03.031 "data_offset": 0, 00:17:03.031 "data_size": 65536 00:17:03.031 }, 00:17:03.031 { 00:17:03.031 "name": "BaseBdev4", 00:17:03.031 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:03.031 "is_configured": true, 00:17:03.031 "data_offset": 0, 00:17:03.031 "data_size": 65536 00:17:03.031 } 00:17:03.031 ] 00:17:03.031 }' 00:17:03.031 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.031 18:53:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.598 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.598 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:03.857 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:03.857 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.857 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:03.857 18:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 52000d7d-3055-45cc-81c3-b481ff28ca4d 00:17:04.116 [2024-07-24 18:53:49.003395] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:04.116 [2024-07-24 18:53:49.003423] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xacdc00 00:17:04.116 [2024-07-24 18:53:49.003427] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:04.116 [2024-07-24 18:53:49.003563] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7d5e0 00:17:04.116 [2024-07-24 18:53:49.003652] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xacdc00 00:17:04.116 [2024-07-24 18:53:49.003657] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xacdc00 00:17:04.116 [2024-07-24 18:53:49.003787] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:04.116 NewBaseBdev 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:04.116 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:04.375 [ 00:17:04.375 { 00:17:04.375 "name": "NewBaseBdev", 00:17:04.375 "aliases": [ 00:17:04.375 "52000d7d-3055-45cc-81c3-b481ff28ca4d" 00:17:04.375 ], 00:17:04.375 "product_name": "Malloc disk", 00:17:04.375 "block_size": 512, 00:17:04.375 "num_blocks": 65536, 00:17:04.375 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:04.375 "assigned_rate_limits": { 00:17:04.375 "rw_ios_per_sec": 0, 00:17:04.375 "rw_mbytes_per_sec": 0, 00:17:04.375 "r_mbytes_per_sec": 0, 00:17:04.375 "w_mbytes_per_sec": 0 00:17:04.375 }, 00:17:04.375 "claimed": true, 00:17:04.375 "claim_type": "exclusive_write", 00:17:04.375 "zoned": false, 00:17:04.375 "supported_io_types": { 00:17:04.375 "read": true, 00:17:04.375 "write": true, 00:17:04.375 "unmap": true, 00:17:04.375 "flush": true, 00:17:04.375 "reset": true, 00:17:04.375 "nvme_admin": false, 00:17:04.375 "nvme_io": false, 00:17:04.375 "nvme_io_md": false, 00:17:04.375 "write_zeroes": true, 00:17:04.375 "zcopy": true, 00:17:04.375 "get_zone_info": false, 00:17:04.375 "zone_management": false, 00:17:04.375 "zone_append": false, 00:17:04.375 "compare": false, 00:17:04.375 "compare_and_write": false, 00:17:04.375 "abort": true, 00:17:04.375 "seek_hole": false, 00:17:04.375 "seek_data": false, 00:17:04.375 "copy": true, 00:17:04.375 "nvme_iov_md": false 00:17:04.375 }, 00:17:04.375 "memory_domains": [ 00:17:04.375 { 00:17:04.375 "dma_device_id": "system", 00:17:04.375 "dma_device_type": 1 00:17:04.375 }, 00:17:04.375 { 00:17:04.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.375 "dma_device_type": 2 00:17:04.375 } 00:17:04.375 ], 00:17:04.375 "driver_specific": {} 00:17:04.375 } 00:17:04.375 ] 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.375 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.635 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.635 "name": "Existed_Raid", 00:17:04.635 "uuid": 
"e9b41e93-3d1d-48f7-bc7f-2a38c3cb2d4e", 00:17:04.635 "strip_size_kb": 0, 00:17:04.635 "state": "online", 00:17:04.635 "raid_level": "raid1", 00:17:04.635 "superblock": false, 00:17:04.635 "num_base_bdevs": 4, 00:17:04.635 "num_base_bdevs_discovered": 4, 00:17:04.635 "num_base_bdevs_operational": 4, 00:17:04.635 "base_bdevs_list": [ 00:17:04.635 { 00:17:04.635 "name": "NewBaseBdev", 00:17:04.635 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:04.635 "is_configured": true, 00:17:04.635 "data_offset": 0, 00:17:04.635 "data_size": 65536 00:17:04.635 }, 00:17:04.635 { 00:17:04.635 "name": "BaseBdev2", 00:17:04.635 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:04.635 "is_configured": true, 00:17:04.635 "data_offset": 0, 00:17:04.635 "data_size": 65536 00:17:04.635 }, 00:17:04.635 { 00:17:04.635 "name": "BaseBdev3", 00:17:04.635 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:04.635 "is_configured": true, 00:17:04.635 "data_offset": 0, 00:17:04.635 "data_size": 65536 00:17:04.635 }, 00:17:04.635 { 00:17:04.635 "name": "BaseBdev4", 00:17:04.635 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:04.635 "is_configured": true, 00:17:04.635 "data_offset": 0, 00:17:04.635 "data_size": 65536 00:17:04.635 } 00:17:04.635 ] 00:17:04.635 }' 00:17:04.635 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.635 18:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:05.236 18:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:05.236 [2024-07-24 18:53:50.146636] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.236 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:05.236 "name": "Existed_Raid", 00:17:05.236 "aliases": [ 00:17:05.236 "e9b41e93-3d1d-48f7-bc7f-2a38c3cb2d4e" 00:17:05.236 ], 00:17:05.236 "product_name": "Raid Volume", 00:17:05.236 "block_size": 512, 00:17:05.236 "num_blocks": 65536, 00:17:05.236 "uuid": "e9b41e93-3d1d-48f7-bc7f-2a38c3cb2d4e", 00:17:05.236 "assigned_rate_limits": { 00:17:05.236 "rw_ios_per_sec": 0, 00:17:05.236 "rw_mbytes_per_sec": 0, 00:17:05.236 "r_mbytes_per_sec": 0, 00:17:05.236 "w_mbytes_per_sec": 0 00:17:05.236 }, 00:17:05.236 "claimed": false, 00:17:05.236 "zoned": false, 00:17:05.236 "supported_io_types": { 00:17:05.236 "read": true, 00:17:05.236 "write": true, 00:17:05.236 "unmap": false, 00:17:05.236 "flush": false, 00:17:05.236 "reset": true, 00:17:05.236 "nvme_admin": false, 00:17:05.236 "nvme_io": false, 00:17:05.236 "nvme_io_md": false, 00:17:05.236 "write_zeroes": true, 00:17:05.236 
"zcopy": false, 00:17:05.236 "get_zone_info": false, 00:17:05.236 "zone_management": false, 00:17:05.236 "zone_append": false, 00:17:05.236 "compare": false, 00:17:05.236 "compare_and_write": false, 00:17:05.236 "abort": false, 00:17:05.237 "seek_hole": false, 00:17:05.237 "seek_data": false, 00:17:05.237 "copy": false, 00:17:05.237 "nvme_iov_md": false 00:17:05.237 }, 00:17:05.237 "memory_domains": [ 00:17:05.237 { 00:17:05.237 "dma_device_id": "system", 00:17:05.237 "dma_device_type": 1 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.237 "dma_device_type": 2 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "system", 00:17:05.237 "dma_device_type": 1 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.237 "dma_device_type": 2 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "system", 00:17:05.237 "dma_device_type": 1 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.237 "dma_device_type": 2 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "system", 00:17:05.237 "dma_device_type": 1 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.237 "dma_device_type": 2 00:17:05.237 } 00:17:05.237 ], 00:17:05.237 "driver_specific": { 00:17:05.237 "raid": { 00:17:05.237 "uuid": "e9b41e93-3d1d-48f7-bc7f-2a38c3cb2d4e", 00:17:05.237 "strip_size_kb": 0, 00:17:05.237 "state": "online", 00:17:05.237 "raid_level": "raid1", 00:17:05.237 "superblock": false, 00:17:05.237 "num_base_bdevs": 4, 00:17:05.237 "num_base_bdevs_discovered": 4, 00:17:05.237 "num_base_bdevs_operational": 4, 00:17:05.237 "base_bdevs_list": [ 00:17:05.237 { 00:17:05.237 "name": "NewBaseBdev", 00:17:05.237 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:05.237 "is_configured": true, 00:17:05.237 "data_offset": 0, 00:17:05.237 "data_size": 65536 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "name": "BaseBdev2", 00:17:05.237 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:05.237 "is_configured": true, 00:17:05.237 "data_offset": 0, 00:17:05.237 "data_size": 65536 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "name": "BaseBdev3", 00:17:05.237 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:05.237 "is_configured": true, 00:17:05.237 "data_offset": 0, 00:17:05.237 "data_size": 65536 00:17:05.237 }, 00:17:05.237 { 00:17:05.237 "name": "BaseBdev4", 00:17:05.237 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:05.237 "is_configured": true, 00:17:05.237 "data_offset": 0, 00:17:05.237 "data_size": 65536 00:17:05.237 } 00:17:05.237 ] 00:17:05.237 } 00:17:05.237 } 00:17:05.237 }' 00:17:05.237 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:05.237 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:05.237 BaseBdev2 00:17:05.237 BaseBdev3 00:17:05.237 BaseBdev4' 00:17:05.237 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.237 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.237 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:05.496 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # base_bdev_info='{ 00:17:05.496 "name": "NewBaseBdev", 00:17:05.496 "aliases": [ 00:17:05.496 "52000d7d-3055-45cc-81c3-b481ff28ca4d" 00:17:05.496 ], 00:17:05.496 "product_name": "Malloc disk", 00:17:05.496 "block_size": 512, 00:17:05.496 "num_blocks": 65536, 00:17:05.496 "uuid": "52000d7d-3055-45cc-81c3-b481ff28ca4d", 00:17:05.496 "assigned_rate_limits": { 00:17:05.496 "rw_ios_per_sec": 0, 00:17:05.496 "rw_mbytes_per_sec": 0, 00:17:05.496 "r_mbytes_per_sec": 0, 00:17:05.496 "w_mbytes_per_sec": 0 00:17:05.496 }, 00:17:05.496 "claimed": true, 00:17:05.496 "claim_type": "exclusive_write", 00:17:05.496 "zoned": false, 00:17:05.496 "supported_io_types": { 00:17:05.496 "read": true, 00:17:05.496 "write": true, 00:17:05.496 "unmap": true, 00:17:05.496 "flush": true, 00:17:05.496 "reset": true, 00:17:05.496 "nvme_admin": false, 00:17:05.496 "nvme_io": false, 00:17:05.496 "nvme_io_md": false, 00:17:05.496 "write_zeroes": true, 00:17:05.496 "zcopy": true, 00:17:05.496 "get_zone_info": false, 00:17:05.496 "zone_management": false, 00:17:05.496 "zone_append": false, 00:17:05.496 "compare": false, 00:17:05.496 "compare_and_write": false, 00:17:05.496 "abort": true, 00:17:05.496 "seek_hole": false, 00:17:05.496 "seek_data": false, 00:17:05.496 "copy": true, 00:17:05.496 "nvme_iov_md": false 00:17:05.496 }, 00:17:05.496 "memory_domains": [ 00:17:05.496 { 00:17:05.496 "dma_device_id": "system", 00:17:05.496 "dma_device_type": 1 00:17:05.496 }, 00:17:05.496 { 00:17:05.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.497 "dma_device_type": 2 00:17:05.497 } 00:17:05.497 ], 00:17:05.497 "driver_specific": {} 00:17:05.497 }' 00:17:05.497 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.497 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.497 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.497 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.497 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:05.755 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.014 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.014 "name": "BaseBdev2", 00:17:06.014 "aliases": [ 00:17:06.014 "ad920af4-910c-4133-a086-ae899b7efbc0" 00:17:06.014 ], 00:17:06.014 "product_name": 
"Malloc disk", 00:17:06.014 "block_size": 512, 00:17:06.014 "num_blocks": 65536, 00:17:06.014 "uuid": "ad920af4-910c-4133-a086-ae899b7efbc0", 00:17:06.014 "assigned_rate_limits": { 00:17:06.014 "rw_ios_per_sec": 0, 00:17:06.014 "rw_mbytes_per_sec": 0, 00:17:06.014 "r_mbytes_per_sec": 0, 00:17:06.014 "w_mbytes_per_sec": 0 00:17:06.014 }, 00:17:06.014 "claimed": true, 00:17:06.014 "claim_type": "exclusive_write", 00:17:06.014 "zoned": false, 00:17:06.014 "supported_io_types": { 00:17:06.014 "read": true, 00:17:06.014 "write": true, 00:17:06.014 "unmap": true, 00:17:06.014 "flush": true, 00:17:06.014 "reset": true, 00:17:06.014 "nvme_admin": false, 00:17:06.014 "nvme_io": false, 00:17:06.014 "nvme_io_md": false, 00:17:06.014 "write_zeroes": true, 00:17:06.014 "zcopy": true, 00:17:06.014 "get_zone_info": false, 00:17:06.014 "zone_management": false, 00:17:06.014 "zone_append": false, 00:17:06.014 "compare": false, 00:17:06.014 "compare_and_write": false, 00:17:06.014 "abort": true, 00:17:06.014 "seek_hole": false, 00:17:06.014 "seek_data": false, 00:17:06.014 "copy": true, 00:17:06.014 "nvme_iov_md": false 00:17:06.014 }, 00:17:06.014 "memory_domains": [ 00:17:06.014 { 00:17:06.014 "dma_device_id": "system", 00:17:06.014 "dma_device_type": 1 00:17:06.014 }, 00:17:06.014 { 00:17:06.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.014 "dma_device_type": 2 00:17:06.014 } 00:17:06.014 ], 00:17:06.014 "driver_specific": {} 00:17:06.014 }' 00:17:06.014 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.014 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.015 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.015 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.015 18:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.015 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.015 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:06.273 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.532 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.532 "name": "BaseBdev3", 00:17:06.532 "aliases": [ 00:17:06.532 "2206938c-eef7-4802-91e8-7c0ac6134e3a" 00:17:06.532 ], 00:17:06.532 "product_name": "Malloc disk", 00:17:06.532 "block_size": 512, 00:17:06.532 "num_blocks": 65536, 00:17:06.532 "uuid": "2206938c-eef7-4802-91e8-7c0ac6134e3a", 00:17:06.532 "assigned_rate_limits": { 
00:17:06.532 "rw_ios_per_sec": 0, 00:17:06.532 "rw_mbytes_per_sec": 0, 00:17:06.532 "r_mbytes_per_sec": 0, 00:17:06.532 "w_mbytes_per_sec": 0 00:17:06.532 }, 00:17:06.532 "claimed": true, 00:17:06.532 "claim_type": "exclusive_write", 00:17:06.532 "zoned": false, 00:17:06.532 "supported_io_types": { 00:17:06.532 "read": true, 00:17:06.532 "write": true, 00:17:06.532 "unmap": true, 00:17:06.532 "flush": true, 00:17:06.532 "reset": true, 00:17:06.532 "nvme_admin": false, 00:17:06.532 "nvme_io": false, 00:17:06.532 "nvme_io_md": false, 00:17:06.532 "write_zeroes": true, 00:17:06.532 "zcopy": true, 00:17:06.532 "get_zone_info": false, 00:17:06.532 "zone_management": false, 00:17:06.532 "zone_append": false, 00:17:06.532 "compare": false, 00:17:06.532 "compare_and_write": false, 00:17:06.532 "abort": true, 00:17:06.532 "seek_hole": false, 00:17:06.532 "seek_data": false, 00:17:06.532 "copy": true, 00:17:06.532 "nvme_iov_md": false 00:17:06.532 }, 00:17:06.532 "memory_domains": [ 00:17:06.532 { 00:17:06.532 "dma_device_id": "system", 00:17:06.532 "dma_device_type": 1 00:17:06.532 }, 00:17:06.532 { 00:17:06.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.532 "dma_device_type": 2 00:17:06.532 } 00:17:06.532 ], 00:17:06.532 "driver_specific": {} 00:17:06.532 }' 00:17:06.532 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.532 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.532 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.532 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.533 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.792 "name": "BaseBdev4", 00:17:06.792 "aliases": [ 00:17:06.792 "aac84985-1871-4457-980a-72423f9e5080" 00:17:06.792 ], 00:17:06.792 "product_name": "Malloc disk", 00:17:06.792 "block_size": 512, 00:17:06.792 "num_blocks": 65536, 00:17:06.792 "uuid": "aac84985-1871-4457-980a-72423f9e5080", 00:17:06.792 "assigned_rate_limits": { 00:17:06.792 "rw_ios_per_sec": 0, 00:17:06.792 "rw_mbytes_per_sec": 0, 00:17:06.792 "r_mbytes_per_sec": 0, 00:17:06.792 "w_mbytes_per_sec": 0 00:17:06.792 }, 00:17:06.792 "claimed": 
true, 00:17:06.792 "claim_type": "exclusive_write", 00:17:06.792 "zoned": false, 00:17:06.792 "supported_io_types": { 00:17:06.792 "read": true, 00:17:06.792 "write": true, 00:17:06.792 "unmap": true, 00:17:06.792 "flush": true, 00:17:06.792 "reset": true, 00:17:06.792 "nvme_admin": false, 00:17:06.792 "nvme_io": false, 00:17:06.792 "nvme_io_md": false, 00:17:06.792 "write_zeroes": true, 00:17:06.792 "zcopy": true, 00:17:06.792 "get_zone_info": false, 00:17:06.792 "zone_management": false, 00:17:06.792 "zone_append": false, 00:17:06.792 "compare": false, 00:17:06.792 "compare_and_write": false, 00:17:06.792 "abort": true, 00:17:06.792 "seek_hole": false, 00:17:06.792 "seek_data": false, 00:17:06.792 "copy": true, 00:17:06.792 "nvme_iov_md": false 00:17:06.792 }, 00:17:06.792 "memory_domains": [ 00:17:06.792 { 00:17:06.792 "dma_device_id": "system", 00:17:06.792 "dma_device_type": 1 00:17:06.792 }, 00:17:06.792 { 00:17:06.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.792 "dma_device_type": 2 00:17:06.792 } 00:17:06.792 ], 00:17:06.792 "driver_specific": {} 00:17:06.792 }' 00:17:06.792 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.050 18:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.050 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.050 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:07.308 [2024-07-24 18:53:52.243828] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:07.308 [2024-07-24 18:53:52.243847] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:07.308 [2024-07-24 18:53:52.243884] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:07.308 [2024-07-24 18:53:52.244087] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:07.308 [2024-07-24 18:53:52.244093] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacdc00 name Existed_Raid, state offline 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2134080 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2134080 ']' 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2134080 00:17:07.308 
18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2134080 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2134080' 00:17:07.308 killing process with pid 2134080 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2134080 00:17:07.308 [2024-07-24 18:53:52.299883] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:07.308 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2134080 00:17:07.567 [2024-07-24 18:53:52.331111] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:07.567 18:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:07.567 00:17:07.567 real 0m24.100s 00:17:07.567 user 0m44.906s 00:17:07.567 sys 0m3.754s 00:17:07.567 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:07.567 18:53:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.567 ************************************ 00:17:07.567 END TEST raid_state_function_test 00:17:07.567 ************************************ 00:17:07.567 18:53:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:17:07.567 18:53:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:07.567 18:53:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:07.567 18:53:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:07.568 ************************************ 00:17:07.568 START TEST raid_state_function_test_sb 00:17:07.568 ************************************ 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:07.568 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:07.826 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2138636 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2138636' 00:17:07.827 Process raid pid: 2138636 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2138636 /var/tmp/spdk-raid.sock 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2138636 ']' 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:07.827 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:07.827 18:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:07.827 [2024-07-24 18:53:52.630300] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:17:07.827 [2024-07-24 18:53:52.630338] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:07.827 [2024-07-24 18:53:52.693643] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:07.827 [2024-07-24 18:53:52.771336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:07.827 [2024-07-24 18:53:52.821551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:07.827 [2024-07-24 18:53:52.821575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:08.761 [2024-07-24 18:53:53.584351] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:08.761 [2024-07-24 18:53:53.584381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:08.761 [2024-07-24 18:53:53.584387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:08.761 [2024-07-24 18:53:53.584393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:08.761 [2024-07-24 18:53:53.584397] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:08.761 [2024-07-24 18:53:53.584402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:08.761 [2024-07-24 18:53:53.584406] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:08.761 [2024-07-24 18:53:53.584411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.761 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.762 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.019 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.019 "name": "Existed_Raid", 00:17:09.019 "uuid": "9628bfeb-b825-4d30-b27e-ed57056829cf", 00:17:09.019 "strip_size_kb": 0, 00:17:09.019 "state": "configuring", 00:17:09.019 "raid_level": "raid1", 00:17:09.019 "superblock": true, 00:17:09.019 "num_base_bdevs": 4, 00:17:09.019 "num_base_bdevs_discovered": 0, 00:17:09.019 "num_base_bdevs_operational": 4, 00:17:09.019 "base_bdevs_list": [ 00:17:09.019 { 00:17:09.019 "name": "BaseBdev1", 00:17:09.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.019 "is_configured": false, 00:17:09.019 "data_offset": 0, 00:17:09.019 "data_size": 0 00:17:09.019 }, 00:17:09.019 { 00:17:09.019 "name": "BaseBdev2", 00:17:09.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.019 "is_configured": false, 00:17:09.019 "data_offset": 0, 00:17:09.019 "data_size": 0 00:17:09.019 }, 00:17:09.019 { 00:17:09.019 "name": "BaseBdev3", 00:17:09.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.019 "is_configured": false, 00:17:09.019 "data_offset": 0, 00:17:09.019 "data_size": 0 00:17:09.019 }, 00:17:09.019 { 00:17:09.019 "name": "BaseBdev4", 00:17:09.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.019 "is_configured": false, 00:17:09.019 "data_offset": 0, 00:17:09.019 "data_size": 0 00:17:09.019 } 00:17:09.019 ] 00:17:09.019 }' 00:17:09.019 18:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.019 18:53:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:09.277 18:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:09.535 [2024-07-24 18:53:54.426438] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:09.535 [2024-07-24 18:53:54.426459] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2769bc0 name Existed_Raid, state configuring 00:17:09.535 18:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:09.793 [2024-07-24 18:53:54.578844] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:09.793 [2024-07-24 18:53:54.578861] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:09.793 [2024-07-24 18:53:54.578865] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:09.793 [2024-07-24 18:53:54.578870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:09.793 [2024-07-24 18:53:54.578874] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:09.793 [2024-07-24 18:53:54.578879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:09.793 [2024-07-24 18:53:54.578883] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:09.793 [2024-07-24 18:53:54.578887] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:09.793 [2024-07-24 18:53:54.743314] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:09.793 BaseBdev1 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:09.793 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.051 18:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:10.310 [ 00:17:10.310 { 00:17:10.310 "name": "BaseBdev1", 00:17:10.310 "aliases": [ 00:17:10.310 "146d8f07-e5fa-4e27-917e-47f0aa8ad51d" 00:17:10.310 ], 00:17:10.310 "product_name": "Malloc disk", 00:17:10.310 "block_size": 512, 00:17:10.310 "num_blocks": 65536, 00:17:10.310 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:10.310 "assigned_rate_limits": { 00:17:10.310 "rw_ios_per_sec": 0, 00:17:10.310 "rw_mbytes_per_sec": 0, 00:17:10.310 "r_mbytes_per_sec": 0, 00:17:10.310 "w_mbytes_per_sec": 0 00:17:10.310 }, 00:17:10.310 "claimed": true, 00:17:10.310 "claim_type": "exclusive_write", 00:17:10.310 "zoned": false, 00:17:10.310 "supported_io_types": { 00:17:10.310 "read": true, 00:17:10.310 "write": true, 00:17:10.310 "unmap": true, 00:17:10.310 "flush": true, 00:17:10.310 "reset": true, 00:17:10.310 "nvme_admin": false, 00:17:10.310 "nvme_io": false, 00:17:10.310 "nvme_io_md": false, 00:17:10.310 "write_zeroes": true, 00:17:10.310 "zcopy": true, 00:17:10.310 "get_zone_info": false, 00:17:10.310 "zone_management": false, 00:17:10.310 "zone_append": false, 00:17:10.310 "compare": false, 00:17:10.310 "compare_and_write": false, 00:17:10.310 "abort": true, 00:17:10.310 "seek_hole": false, 00:17:10.310 "seek_data": false, 00:17:10.310 "copy": true, 00:17:10.310 "nvme_iov_md": false 00:17:10.310 }, 00:17:10.310 "memory_domains": [ 00:17:10.310 { 00:17:10.310 "dma_device_id": "system", 00:17:10.310 "dma_device_type": 1 00:17:10.310 }, 00:17:10.310 { 00:17:10.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.310 "dma_device_type": 2 00:17:10.310 } 00:17:10.310 ], 00:17:10.310 
"driver_specific": {} 00:17:10.310 } 00:17:10.310 ] 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.310 "name": "Existed_Raid", 00:17:10.310 "uuid": "deca90bd-2e60-445f-8db0-3e1603818486", 00:17:10.310 "strip_size_kb": 0, 00:17:10.310 "state": "configuring", 00:17:10.310 "raid_level": "raid1", 00:17:10.310 "superblock": true, 00:17:10.310 "num_base_bdevs": 4, 00:17:10.310 "num_base_bdevs_discovered": 1, 00:17:10.310 "num_base_bdevs_operational": 4, 00:17:10.310 "base_bdevs_list": [ 00:17:10.310 { 00:17:10.310 "name": "BaseBdev1", 00:17:10.310 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:10.310 "is_configured": true, 00:17:10.310 "data_offset": 2048, 00:17:10.310 "data_size": 63488 00:17:10.310 }, 00:17:10.310 { 00:17:10.310 "name": "BaseBdev2", 00:17:10.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.310 "is_configured": false, 00:17:10.310 "data_offset": 0, 00:17:10.310 "data_size": 0 00:17:10.310 }, 00:17:10.310 { 00:17:10.310 "name": "BaseBdev3", 00:17:10.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.310 "is_configured": false, 00:17:10.310 "data_offset": 0, 00:17:10.310 "data_size": 0 00:17:10.310 }, 00:17:10.310 { 00:17:10.310 "name": "BaseBdev4", 00:17:10.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.310 "is_configured": false, 00:17:10.310 "data_offset": 0, 00:17:10.310 "data_size": 0 00:17:10.310 } 00:17:10.310 ] 00:17:10.310 }' 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.310 18:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.876 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:11.134 
[2024-07-24 18:53:55.914335] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:11.134 [2024-07-24 18:53:55.914369] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2769430 name Existed_Raid, state configuring 00:17:11.134 18:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:11.134 [2024-07-24 18:53:56.086800] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:11.134 [2024-07-24 18:53:56.087803] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:11.134 [2024-07-24 18:53:56.087827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:11.134 [2024-07-24 18:53:56.087832] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:11.134 [2024-07-24 18:53:56.087837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:11.134 [2024-07-24 18:53:56.087841] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:11.134 [2024-07-24 18:53:56.087846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.134 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.394 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.394 "name": "Existed_Raid", 00:17:11.394 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:11.394 "strip_size_kb": 0, 00:17:11.394 "state": "configuring", 00:17:11.394 "raid_level": "raid1", 00:17:11.394 "superblock": true, 00:17:11.394 
"num_base_bdevs": 4, 00:17:11.394 "num_base_bdevs_discovered": 1, 00:17:11.394 "num_base_bdevs_operational": 4, 00:17:11.394 "base_bdevs_list": [ 00:17:11.394 { 00:17:11.394 "name": "BaseBdev1", 00:17:11.394 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:11.394 "is_configured": true, 00:17:11.394 "data_offset": 2048, 00:17:11.394 "data_size": 63488 00:17:11.394 }, 00:17:11.394 { 00:17:11.394 "name": "BaseBdev2", 00:17:11.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.394 "is_configured": false, 00:17:11.394 "data_offset": 0, 00:17:11.394 "data_size": 0 00:17:11.394 }, 00:17:11.394 { 00:17:11.394 "name": "BaseBdev3", 00:17:11.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.394 "is_configured": false, 00:17:11.394 "data_offset": 0, 00:17:11.394 "data_size": 0 00:17:11.394 }, 00:17:11.394 { 00:17:11.394 "name": "BaseBdev4", 00:17:11.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.394 "is_configured": false, 00:17:11.394 "data_offset": 0, 00:17:11.394 "data_size": 0 00:17:11.394 } 00:17:11.394 ] 00:17:11.394 }' 00:17:11.394 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.394 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.960 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:11.960 [2024-07-24 18:53:56.883412] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:11.960 BaseBdev2 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:11.961 18:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:12.218 18:53:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:12.477 [ 00:17:12.477 { 00:17:12.477 "name": "BaseBdev2", 00:17:12.477 "aliases": [ 00:17:12.477 "81e8431b-fd2f-4202-a284-8e42c4ee5b00" 00:17:12.477 ], 00:17:12.477 "product_name": "Malloc disk", 00:17:12.477 "block_size": 512, 00:17:12.477 "num_blocks": 65536, 00:17:12.477 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:12.477 "assigned_rate_limits": { 00:17:12.477 "rw_ios_per_sec": 0, 00:17:12.477 "rw_mbytes_per_sec": 0, 00:17:12.477 "r_mbytes_per_sec": 0, 00:17:12.477 "w_mbytes_per_sec": 0 00:17:12.477 }, 00:17:12.477 "claimed": true, 00:17:12.477 "claim_type": "exclusive_write", 00:17:12.477 "zoned": false, 00:17:12.477 "supported_io_types": { 00:17:12.477 "read": true, 00:17:12.477 "write": true, 00:17:12.477 "unmap": true, 00:17:12.477 "flush": true, 
00:17:12.477 "reset": true, 00:17:12.477 "nvme_admin": false, 00:17:12.477 "nvme_io": false, 00:17:12.477 "nvme_io_md": false, 00:17:12.477 "write_zeroes": true, 00:17:12.477 "zcopy": true, 00:17:12.477 "get_zone_info": false, 00:17:12.477 "zone_management": false, 00:17:12.477 "zone_append": false, 00:17:12.477 "compare": false, 00:17:12.477 "compare_and_write": false, 00:17:12.477 "abort": true, 00:17:12.477 "seek_hole": false, 00:17:12.477 "seek_data": false, 00:17:12.477 "copy": true, 00:17:12.477 "nvme_iov_md": false 00:17:12.477 }, 00:17:12.477 "memory_domains": [ 00:17:12.477 { 00:17:12.477 "dma_device_id": "system", 00:17:12.477 "dma_device_type": 1 00:17:12.477 }, 00:17:12.477 { 00:17:12.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.477 "dma_device_type": 2 00:17:12.477 } 00:17:12.477 ], 00:17:12.477 "driver_specific": {} 00:17:12.477 } 00:17:12.477 ] 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.477 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.477 "name": "Existed_Raid", 00:17:12.477 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:12.477 "strip_size_kb": 0, 00:17:12.477 "state": "configuring", 00:17:12.477 "raid_level": "raid1", 00:17:12.477 "superblock": true, 00:17:12.477 "num_base_bdevs": 4, 00:17:12.477 "num_base_bdevs_discovered": 2, 00:17:12.477 "num_base_bdevs_operational": 4, 00:17:12.477 "base_bdevs_list": [ 00:17:12.478 { 00:17:12.478 "name": "BaseBdev1", 00:17:12.478 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:12.478 "is_configured": true, 00:17:12.478 "data_offset": 2048, 00:17:12.478 "data_size": 63488 00:17:12.478 }, 00:17:12.478 { 00:17:12.478 "name": "BaseBdev2", 00:17:12.478 "uuid": 
"81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:12.478 "is_configured": true, 00:17:12.478 "data_offset": 2048, 00:17:12.478 "data_size": 63488 00:17:12.478 }, 00:17:12.478 { 00:17:12.478 "name": "BaseBdev3", 00:17:12.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.478 "is_configured": false, 00:17:12.478 "data_offset": 0, 00:17:12.478 "data_size": 0 00:17:12.478 }, 00:17:12.478 { 00:17:12.478 "name": "BaseBdev4", 00:17:12.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.478 "is_configured": false, 00:17:12.478 "data_offset": 0, 00:17:12.478 "data_size": 0 00:17:12.478 } 00:17:12.478 ] 00:17:12.478 }' 00:17:12.478 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.478 18:53:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.045 18:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:13.304 [2024-07-24 18:53:58.093078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:13.304 BaseBdev3 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.304 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:13.562 [ 00:17:13.562 { 00:17:13.562 "name": "BaseBdev3", 00:17:13.562 "aliases": [ 00:17:13.562 "f8fb9318-2550-4ff4-ae34-7486e99ab371" 00:17:13.562 ], 00:17:13.562 "product_name": "Malloc disk", 00:17:13.562 "block_size": 512, 00:17:13.562 "num_blocks": 65536, 00:17:13.562 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:13.562 "assigned_rate_limits": { 00:17:13.562 "rw_ios_per_sec": 0, 00:17:13.562 "rw_mbytes_per_sec": 0, 00:17:13.562 "r_mbytes_per_sec": 0, 00:17:13.562 "w_mbytes_per_sec": 0 00:17:13.562 }, 00:17:13.562 "claimed": true, 00:17:13.562 "claim_type": "exclusive_write", 00:17:13.562 "zoned": false, 00:17:13.562 "supported_io_types": { 00:17:13.562 "read": true, 00:17:13.562 "write": true, 00:17:13.562 "unmap": true, 00:17:13.562 "flush": true, 00:17:13.562 "reset": true, 00:17:13.562 "nvme_admin": false, 00:17:13.562 "nvme_io": false, 00:17:13.562 "nvme_io_md": false, 00:17:13.562 "write_zeroes": true, 00:17:13.562 "zcopy": true, 00:17:13.562 "get_zone_info": false, 00:17:13.562 "zone_management": false, 00:17:13.562 "zone_append": false, 00:17:13.562 "compare": false, 00:17:13.562 "compare_and_write": false, 00:17:13.562 "abort": true, 00:17:13.562 "seek_hole": false, 00:17:13.562 
"seek_data": false, 00:17:13.562 "copy": true, 00:17:13.562 "nvme_iov_md": false 00:17:13.562 }, 00:17:13.562 "memory_domains": [ 00:17:13.562 { 00:17:13.562 "dma_device_id": "system", 00:17:13.562 "dma_device_type": 1 00:17:13.562 }, 00:17:13.562 { 00:17:13.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.562 "dma_device_type": 2 00:17:13.562 } 00:17:13.562 ], 00:17:13.562 "driver_specific": {} 00:17:13.562 } 00:17:13.562 ] 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.562 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.563 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.563 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.563 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.563 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.821 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.821 "name": "Existed_Raid", 00:17:13.821 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:13.821 "strip_size_kb": 0, 00:17:13.821 "state": "configuring", 00:17:13.821 "raid_level": "raid1", 00:17:13.821 "superblock": true, 00:17:13.821 "num_base_bdevs": 4, 00:17:13.821 "num_base_bdevs_discovered": 3, 00:17:13.821 "num_base_bdevs_operational": 4, 00:17:13.821 "base_bdevs_list": [ 00:17:13.821 { 00:17:13.821 "name": "BaseBdev1", 00:17:13.821 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:13.821 "is_configured": true, 00:17:13.821 "data_offset": 2048, 00:17:13.821 "data_size": 63488 00:17:13.821 }, 00:17:13.821 { 00:17:13.821 "name": "BaseBdev2", 00:17:13.821 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:13.821 "is_configured": true, 00:17:13.821 "data_offset": 2048, 00:17:13.821 "data_size": 63488 00:17:13.821 }, 00:17:13.821 { 00:17:13.821 "name": "BaseBdev3", 00:17:13.821 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:13.821 "is_configured": true, 00:17:13.821 "data_offset": 2048, 00:17:13.821 "data_size": 63488 00:17:13.821 }, 00:17:13.821 { 00:17:13.821 "name": "BaseBdev4", 00:17:13.821 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:13.821 "is_configured": false, 00:17:13.821 "data_offset": 0, 00:17:13.821 "data_size": 0 00:17:13.821 } 00:17:13.821 ] 00:17:13.821 }' 00:17:13.821 18:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.821 18:53:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.079 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:14.338 [2024-07-24 18:53:59.230685] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:14.338 [2024-07-24 18:53:59.230806] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x276a490 00:17:14.338 [2024-07-24 18:53:59.230815] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:14.338 [2024-07-24 18:53:59.230930] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27562d0 00:17:14.338 [2024-07-24 18:53:59.231013] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x276a490 00:17:14.338 [2024-07-24 18:53:59.231018] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x276a490 00:17:14.338 [2024-07-24 18:53:59.231079] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:14.338 BaseBdev4 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:14.338 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:14.597 [ 00:17:14.597 { 00:17:14.597 "name": "BaseBdev4", 00:17:14.597 "aliases": [ 00:17:14.597 "c589e351-204c-4b80-b39d-d34f8f643623" 00:17:14.597 ], 00:17:14.597 "product_name": "Malloc disk", 00:17:14.597 "block_size": 512, 00:17:14.597 "num_blocks": 65536, 00:17:14.597 "uuid": "c589e351-204c-4b80-b39d-d34f8f643623", 00:17:14.597 "assigned_rate_limits": { 00:17:14.597 "rw_ios_per_sec": 0, 00:17:14.597 "rw_mbytes_per_sec": 0, 00:17:14.597 "r_mbytes_per_sec": 0, 00:17:14.597 "w_mbytes_per_sec": 0 00:17:14.597 }, 00:17:14.597 "claimed": true, 00:17:14.597 "claim_type": "exclusive_write", 00:17:14.597 "zoned": false, 00:17:14.597 "supported_io_types": { 00:17:14.597 "read": true, 00:17:14.597 "write": true, 00:17:14.597 "unmap": true, 00:17:14.597 "flush": true, 00:17:14.597 "reset": true, 00:17:14.597 "nvme_admin": false, 00:17:14.597 "nvme_io": false, 00:17:14.597 "nvme_io_md": false, 00:17:14.597 
"write_zeroes": true, 00:17:14.597 "zcopy": true, 00:17:14.597 "get_zone_info": false, 00:17:14.597 "zone_management": false, 00:17:14.597 "zone_append": false, 00:17:14.597 "compare": false, 00:17:14.597 "compare_and_write": false, 00:17:14.597 "abort": true, 00:17:14.597 "seek_hole": false, 00:17:14.597 "seek_data": false, 00:17:14.597 "copy": true, 00:17:14.597 "nvme_iov_md": false 00:17:14.597 }, 00:17:14.597 "memory_domains": [ 00:17:14.597 { 00:17:14.597 "dma_device_id": "system", 00:17:14.597 "dma_device_type": 1 00:17:14.597 }, 00:17:14.597 { 00:17:14.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.597 "dma_device_type": 2 00:17:14.597 } 00:17:14.597 ], 00:17:14.597 "driver_specific": {} 00:17:14.597 } 00:17:14.597 ] 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.597 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.855 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.855 "name": "Existed_Raid", 00:17:14.855 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:14.855 "strip_size_kb": 0, 00:17:14.855 "state": "online", 00:17:14.855 "raid_level": "raid1", 00:17:14.855 "superblock": true, 00:17:14.855 "num_base_bdevs": 4, 00:17:14.855 "num_base_bdevs_discovered": 4, 00:17:14.855 "num_base_bdevs_operational": 4, 00:17:14.855 "base_bdevs_list": [ 00:17:14.855 { 00:17:14.855 "name": "BaseBdev1", 00:17:14.855 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:14.855 "is_configured": true, 00:17:14.855 "data_offset": 2048, 00:17:14.855 "data_size": 63488 00:17:14.855 }, 00:17:14.855 { 00:17:14.855 "name": "BaseBdev2", 00:17:14.855 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:14.855 "is_configured": true, 00:17:14.855 "data_offset": 2048, 00:17:14.855 "data_size": 63488 00:17:14.855 }, 00:17:14.855 { 
00:17:14.855 "name": "BaseBdev3", 00:17:14.855 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:14.855 "is_configured": true, 00:17:14.855 "data_offset": 2048, 00:17:14.855 "data_size": 63488 00:17:14.855 }, 00:17:14.855 { 00:17:14.855 "name": "BaseBdev4", 00:17:14.855 "uuid": "c589e351-204c-4b80-b39d-d34f8f643623", 00:17:14.855 "is_configured": true, 00:17:14.855 "data_offset": 2048, 00:17:14.855 "data_size": 63488 00:17:14.855 } 00:17:14.855 ] 00:17:14.855 }' 00:17:14.855 18:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.855 18:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.422 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:15.422 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:15.422 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:15.422 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:15.422 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:15.423 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:15.423 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:15.423 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:15.423 [2024-07-24 18:54:00.426030] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:15.681 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:15.681 "name": "Existed_Raid", 00:17:15.681 "aliases": [ 00:17:15.681 "635dcb96-7127-4962-acbe-55b41b260ab4" 00:17:15.681 ], 00:17:15.681 "product_name": "Raid Volume", 00:17:15.681 "block_size": 512, 00:17:15.681 "num_blocks": 63488, 00:17:15.681 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:15.681 "assigned_rate_limits": { 00:17:15.681 "rw_ios_per_sec": 0, 00:17:15.681 "rw_mbytes_per_sec": 0, 00:17:15.681 "r_mbytes_per_sec": 0, 00:17:15.681 "w_mbytes_per_sec": 0 00:17:15.681 }, 00:17:15.681 "claimed": false, 00:17:15.681 "zoned": false, 00:17:15.681 "supported_io_types": { 00:17:15.681 "read": true, 00:17:15.681 "write": true, 00:17:15.681 "unmap": false, 00:17:15.681 "flush": false, 00:17:15.681 "reset": true, 00:17:15.681 "nvme_admin": false, 00:17:15.681 "nvme_io": false, 00:17:15.681 "nvme_io_md": false, 00:17:15.681 "write_zeroes": true, 00:17:15.681 "zcopy": false, 00:17:15.681 "get_zone_info": false, 00:17:15.681 "zone_management": false, 00:17:15.681 "zone_append": false, 00:17:15.681 "compare": false, 00:17:15.681 "compare_and_write": false, 00:17:15.681 "abort": false, 00:17:15.681 "seek_hole": false, 00:17:15.681 "seek_data": false, 00:17:15.681 "copy": false, 00:17:15.681 "nvme_iov_md": false 00:17:15.681 }, 00:17:15.681 "memory_domains": [ 00:17:15.681 { 00:17:15.681 "dma_device_id": "system", 00:17:15.681 "dma_device_type": 1 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.681 "dma_device_type": 2 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "system", 00:17:15.681 "dma_device_type": 1 00:17:15.681 }, 00:17:15.681 { 
00:17:15.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.681 "dma_device_type": 2 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "system", 00:17:15.681 "dma_device_type": 1 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.681 "dma_device_type": 2 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "system", 00:17:15.681 "dma_device_type": 1 00:17:15.681 }, 00:17:15.681 { 00:17:15.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.681 "dma_device_type": 2 00:17:15.681 } 00:17:15.681 ], 00:17:15.682 "driver_specific": { 00:17:15.682 "raid": { 00:17:15.682 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:15.682 "strip_size_kb": 0, 00:17:15.682 "state": "online", 00:17:15.682 "raid_level": "raid1", 00:17:15.682 "superblock": true, 00:17:15.682 "num_base_bdevs": 4, 00:17:15.682 "num_base_bdevs_discovered": 4, 00:17:15.682 "num_base_bdevs_operational": 4, 00:17:15.682 "base_bdevs_list": [ 00:17:15.682 { 00:17:15.682 "name": "BaseBdev1", 00:17:15.682 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:15.682 "is_configured": true, 00:17:15.682 "data_offset": 2048, 00:17:15.682 "data_size": 63488 00:17:15.682 }, 00:17:15.682 { 00:17:15.682 "name": "BaseBdev2", 00:17:15.682 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:15.682 "is_configured": true, 00:17:15.682 "data_offset": 2048, 00:17:15.682 "data_size": 63488 00:17:15.682 }, 00:17:15.682 { 00:17:15.682 "name": "BaseBdev3", 00:17:15.682 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:15.682 "is_configured": true, 00:17:15.682 "data_offset": 2048, 00:17:15.682 "data_size": 63488 00:17:15.682 }, 00:17:15.682 { 00:17:15.682 "name": "BaseBdev4", 00:17:15.682 "uuid": "c589e351-204c-4b80-b39d-d34f8f643623", 00:17:15.682 "is_configured": true, 00:17:15.682 "data_offset": 2048, 00:17:15.682 "data_size": 63488 00:17:15.682 } 00:17:15.682 ] 00:17:15.682 } 00:17:15.682 } 00:17:15.682 }' 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:15.682 BaseBdev2 00:17:15.682 BaseBdev3 00:17:15.682 BaseBdev4' 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:15.682 "name": "BaseBdev1", 00:17:15.682 "aliases": [ 00:17:15.682 "146d8f07-e5fa-4e27-917e-47f0aa8ad51d" 00:17:15.682 ], 00:17:15.682 "product_name": "Malloc disk", 00:17:15.682 "block_size": 512, 00:17:15.682 "num_blocks": 65536, 00:17:15.682 "uuid": "146d8f07-e5fa-4e27-917e-47f0aa8ad51d", 00:17:15.682 "assigned_rate_limits": { 00:17:15.682 "rw_ios_per_sec": 0, 00:17:15.682 "rw_mbytes_per_sec": 0, 00:17:15.682 "r_mbytes_per_sec": 0, 00:17:15.682 "w_mbytes_per_sec": 0 00:17:15.682 }, 00:17:15.682 "claimed": true, 00:17:15.682 "claim_type": "exclusive_write", 00:17:15.682 "zoned": false, 00:17:15.682 "supported_io_types": { 00:17:15.682 "read": true, 00:17:15.682 "write": true, 
00:17:15.682 "unmap": true, 00:17:15.682 "flush": true, 00:17:15.682 "reset": true, 00:17:15.682 "nvme_admin": false, 00:17:15.682 "nvme_io": false, 00:17:15.682 "nvme_io_md": false, 00:17:15.682 "write_zeroes": true, 00:17:15.682 "zcopy": true, 00:17:15.682 "get_zone_info": false, 00:17:15.682 "zone_management": false, 00:17:15.682 "zone_append": false, 00:17:15.682 "compare": false, 00:17:15.682 "compare_and_write": false, 00:17:15.682 "abort": true, 00:17:15.682 "seek_hole": false, 00:17:15.682 "seek_data": false, 00:17:15.682 "copy": true, 00:17:15.682 "nvme_iov_md": false 00:17:15.682 }, 00:17:15.682 "memory_domains": [ 00:17:15.682 { 00:17:15.682 "dma_device_id": "system", 00:17:15.682 "dma_device_type": 1 00:17:15.682 }, 00:17:15.682 { 00:17:15.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.682 "dma_device_type": 2 00:17:15.682 } 00:17:15.682 ], 00:17:15.682 "driver_specific": {} 00:17:15.682 }' 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:15.682 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:15.941 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.199 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.199 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.199 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:16.199 18:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:16.199 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:16.199 "name": "BaseBdev2", 00:17:16.200 "aliases": [ 00:17:16.200 "81e8431b-fd2f-4202-a284-8e42c4ee5b00" 00:17:16.200 ], 00:17:16.200 "product_name": "Malloc disk", 00:17:16.200 "block_size": 512, 00:17:16.200 "num_blocks": 65536, 00:17:16.200 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:16.200 "assigned_rate_limits": { 00:17:16.200 "rw_ios_per_sec": 0, 00:17:16.200 "rw_mbytes_per_sec": 0, 00:17:16.200 "r_mbytes_per_sec": 0, 00:17:16.200 "w_mbytes_per_sec": 0 00:17:16.200 }, 00:17:16.200 "claimed": true, 00:17:16.200 "claim_type": "exclusive_write", 00:17:16.200 "zoned": false, 00:17:16.200 "supported_io_types": { 00:17:16.200 "read": true, 00:17:16.200 "write": true, 00:17:16.200 "unmap": true, 00:17:16.200 "flush": true, 00:17:16.200 "reset": true, 00:17:16.200 "nvme_admin": false, 00:17:16.200 
"nvme_io": false, 00:17:16.200 "nvme_io_md": false, 00:17:16.200 "write_zeroes": true, 00:17:16.200 "zcopy": true, 00:17:16.200 "get_zone_info": false, 00:17:16.200 "zone_management": false, 00:17:16.200 "zone_append": false, 00:17:16.200 "compare": false, 00:17:16.200 "compare_and_write": false, 00:17:16.200 "abort": true, 00:17:16.200 "seek_hole": false, 00:17:16.200 "seek_data": false, 00:17:16.200 "copy": true, 00:17:16.200 "nvme_iov_md": false 00:17:16.200 }, 00:17:16.200 "memory_domains": [ 00:17:16.200 { 00:17:16.200 "dma_device_id": "system", 00:17:16.200 "dma_device_type": 1 00:17:16.200 }, 00:17:16.200 { 00:17:16.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.200 "dma_device_type": 2 00:17:16.200 } 00:17:16.200 ], 00:17:16.200 "driver_specific": {} 00:17:16.200 }' 00:17:16.200 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.200 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.200 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:16.200 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.458 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:16.459 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:16.717 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:16.717 "name": "BaseBdev3", 00:17:16.717 "aliases": [ 00:17:16.717 "f8fb9318-2550-4ff4-ae34-7486e99ab371" 00:17:16.717 ], 00:17:16.717 "product_name": "Malloc disk", 00:17:16.717 "block_size": 512, 00:17:16.717 "num_blocks": 65536, 00:17:16.717 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:16.717 "assigned_rate_limits": { 00:17:16.717 "rw_ios_per_sec": 0, 00:17:16.717 "rw_mbytes_per_sec": 0, 00:17:16.717 "r_mbytes_per_sec": 0, 00:17:16.717 "w_mbytes_per_sec": 0 00:17:16.717 }, 00:17:16.717 "claimed": true, 00:17:16.717 "claim_type": "exclusive_write", 00:17:16.717 "zoned": false, 00:17:16.717 "supported_io_types": { 00:17:16.717 "read": true, 00:17:16.717 "write": true, 00:17:16.717 "unmap": true, 00:17:16.717 "flush": true, 00:17:16.717 "reset": true, 00:17:16.717 "nvme_admin": false, 00:17:16.717 "nvme_io": false, 00:17:16.717 "nvme_io_md": false, 00:17:16.717 "write_zeroes": true, 00:17:16.717 "zcopy": true, 00:17:16.717 
"get_zone_info": false, 00:17:16.717 "zone_management": false, 00:17:16.717 "zone_append": false, 00:17:16.717 "compare": false, 00:17:16.717 "compare_and_write": false, 00:17:16.717 "abort": true, 00:17:16.717 "seek_hole": false, 00:17:16.717 "seek_data": false, 00:17:16.717 "copy": true, 00:17:16.717 "nvme_iov_md": false 00:17:16.717 }, 00:17:16.717 "memory_domains": [ 00:17:16.717 { 00:17:16.717 "dma_device_id": "system", 00:17:16.717 "dma_device_type": 1 00:17:16.717 }, 00:17:16.717 { 00:17:16.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.717 "dma_device_type": 2 00:17:16.717 } 00:17:16.717 ], 00:17:16.717 "driver_specific": {} 00:17:16.717 }' 00:17:16.717 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.717 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.717 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:16.717 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:16.976 18:54:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:17.235 "name": "BaseBdev4", 00:17:17.235 "aliases": [ 00:17:17.235 "c589e351-204c-4b80-b39d-d34f8f643623" 00:17:17.235 ], 00:17:17.235 "product_name": "Malloc disk", 00:17:17.235 "block_size": 512, 00:17:17.235 "num_blocks": 65536, 00:17:17.235 "uuid": "c589e351-204c-4b80-b39d-d34f8f643623", 00:17:17.235 "assigned_rate_limits": { 00:17:17.235 "rw_ios_per_sec": 0, 00:17:17.235 "rw_mbytes_per_sec": 0, 00:17:17.235 "r_mbytes_per_sec": 0, 00:17:17.235 "w_mbytes_per_sec": 0 00:17:17.235 }, 00:17:17.235 "claimed": true, 00:17:17.235 "claim_type": "exclusive_write", 00:17:17.235 "zoned": false, 00:17:17.235 "supported_io_types": { 00:17:17.235 "read": true, 00:17:17.235 "write": true, 00:17:17.235 "unmap": true, 00:17:17.235 "flush": true, 00:17:17.235 "reset": true, 00:17:17.235 "nvme_admin": false, 00:17:17.235 "nvme_io": false, 00:17:17.235 "nvme_io_md": false, 00:17:17.235 "write_zeroes": true, 00:17:17.235 "zcopy": true, 00:17:17.235 "get_zone_info": false, 00:17:17.235 "zone_management": false, 00:17:17.235 "zone_append": false, 00:17:17.235 "compare": false, 
00:17:17.235 "compare_and_write": false, 00:17:17.235 "abort": true, 00:17:17.235 "seek_hole": false, 00:17:17.235 "seek_data": false, 00:17:17.235 "copy": true, 00:17:17.235 "nvme_iov_md": false 00:17:17.235 }, 00:17:17.235 "memory_domains": [ 00:17:17.235 { 00:17:17.235 "dma_device_id": "system", 00:17:17.235 "dma_device_type": 1 00:17:17.235 }, 00:17:17.235 { 00:17:17.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.235 "dma_device_type": 2 00:17:17.235 } 00:17:17.235 ], 00:17:17.235 "driver_specific": {} 00:17:17.235 }' 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.235 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:17.493 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:17.752 [2024-07-24 18:54:02.535285] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.752 18:54:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.752 "name": "Existed_Raid", 00:17:17.752 "uuid": "635dcb96-7127-4962-acbe-55b41b260ab4", 00:17:17.752 "strip_size_kb": 0, 00:17:17.752 "state": "online", 00:17:17.752 "raid_level": "raid1", 00:17:17.752 "superblock": true, 00:17:17.752 "num_base_bdevs": 4, 00:17:17.752 "num_base_bdevs_discovered": 3, 00:17:17.752 "num_base_bdevs_operational": 3, 00:17:17.752 "base_bdevs_list": [ 00:17:17.752 { 00:17:17.752 "name": null, 00:17:17.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.752 "is_configured": false, 00:17:17.752 "data_offset": 2048, 00:17:17.752 "data_size": 63488 00:17:17.752 }, 00:17:17.752 { 00:17:17.752 "name": "BaseBdev2", 00:17:17.752 "uuid": "81e8431b-fd2f-4202-a284-8e42c4ee5b00", 00:17:17.752 "is_configured": true, 00:17:17.752 "data_offset": 2048, 00:17:17.752 "data_size": 63488 00:17:17.752 }, 00:17:17.752 { 00:17:17.752 "name": "BaseBdev3", 00:17:17.752 "uuid": "f8fb9318-2550-4ff4-ae34-7486e99ab371", 00:17:17.752 "is_configured": true, 00:17:17.752 "data_offset": 2048, 00:17:17.752 "data_size": 63488 00:17:17.752 }, 00:17:17.752 { 00:17:17.752 "name": "BaseBdev4", 00:17:17.752 "uuid": "c589e351-204c-4b80-b39d-d34f8f643623", 00:17:17.752 "is_configured": true, 00:17:17.752 "data_offset": 2048, 00:17:17.752 "data_size": 63488 00:17:17.752 } 00:17:17.752 ] 00:17:17.752 }' 00:17:17.752 18:54:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.753 18:54:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.316 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:18.316 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:18.316 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:18.316 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.572 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:18.573 [2024-07-24 18:54:03.498705] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.573 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:18.830 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:18.830 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:18.830 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:19.088 [2024-07-24 18:54:03.841559] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:19.088 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:19.088 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:19.088 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.088 18:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:19.088 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:19.088 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:19.088 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:19.347 [2024-07-24 18:54:04.196286] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:19.347 [2024-07-24 18:54:04.196347] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:19.347 [2024-07-24 18:54:04.206324] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:19.347 [2024-07-24 18:54:04.206363] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:19.347 [2024-07-24 18:54:04.206369] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x276a490 name Existed_Raid, state offline 00:17:19.347 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:19.347 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:19.347 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.347 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:19.605 18:54:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:19.605 BaseBdev2 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:19.605 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.863 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:20.122 [ 00:17:20.122 { 00:17:20.122 "name": "BaseBdev2", 00:17:20.122 "aliases": [ 00:17:20.122 "d216cc46-89ff-49d5-90f5-037e331e8645" 00:17:20.122 ], 00:17:20.122 "product_name": "Malloc disk", 00:17:20.122 "block_size": 512, 00:17:20.122 "num_blocks": 65536, 00:17:20.122 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:20.122 "assigned_rate_limits": { 00:17:20.122 "rw_ios_per_sec": 0, 00:17:20.122 "rw_mbytes_per_sec": 0, 00:17:20.122 "r_mbytes_per_sec": 0, 00:17:20.122 "w_mbytes_per_sec": 0 00:17:20.122 }, 00:17:20.122 "claimed": false, 00:17:20.122 "zoned": false, 00:17:20.122 "supported_io_types": { 00:17:20.122 "read": true, 00:17:20.122 "write": true, 00:17:20.122 "unmap": true, 00:17:20.122 "flush": true, 00:17:20.122 "reset": true, 00:17:20.122 "nvme_admin": false, 00:17:20.122 "nvme_io": false, 00:17:20.122 "nvme_io_md": false, 00:17:20.122 "write_zeroes": true, 00:17:20.122 "zcopy": true, 00:17:20.122 "get_zone_info": false, 00:17:20.122 "zone_management": false, 00:17:20.122 "zone_append": false, 00:17:20.122 "compare": false, 00:17:20.122 "compare_and_write": false, 00:17:20.122 "abort": true, 00:17:20.122 "seek_hole": false, 00:17:20.122 "seek_data": false, 00:17:20.122 "copy": true, 00:17:20.122 "nvme_iov_md": false 00:17:20.122 }, 00:17:20.122 "memory_domains": [ 00:17:20.122 { 00:17:20.122 "dma_device_id": "system", 00:17:20.122 "dma_device_type": 1 00:17:20.122 }, 00:17:20.122 { 00:17:20.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.122 "dma_device_type": 2 00:17:20.122 } 00:17:20.122 ], 00:17:20.122 "driver_specific": {} 00:17:20.122 } 00:17:20.123 ] 00:17:20.123 18:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:20.123 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:20.123 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:20.123 18:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:20.123 BaseBdev3 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:20.123 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.381 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:20.381 [ 00:17:20.381 { 00:17:20.381 "name": "BaseBdev3", 00:17:20.381 "aliases": [ 00:17:20.381 "372f735a-fba8-4346-9a37-ab89e3dc2a48" 00:17:20.381 ], 00:17:20.381 "product_name": "Malloc disk", 00:17:20.381 "block_size": 512, 00:17:20.381 "num_blocks": 65536, 00:17:20.381 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:20.381 "assigned_rate_limits": { 00:17:20.381 "rw_ios_per_sec": 0, 00:17:20.381 "rw_mbytes_per_sec": 0, 00:17:20.381 "r_mbytes_per_sec": 0, 00:17:20.381 "w_mbytes_per_sec": 0 00:17:20.381 }, 00:17:20.381 "claimed": false, 00:17:20.381 "zoned": false, 00:17:20.381 "supported_io_types": { 00:17:20.381 "read": true, 00:17:20.381 "write": true, 00:17:20.381 "unmap": true, 00:17:20.381 "flush": true, 00:17:20.381 "reset": true, 00:17:20.381 "nvme_admin": false, 00:17:20.381 "nvme_io": false, 00:17:20.381 "nvme_io_md": false, 00:17:20.381 "write_zeroes": true, 00:17:20.381 "zcopy": true, 00:17:20.381 "get_zone_info": false, 00:17:20.381 "zone_management": false, 00:17:20.381 "zone_append": false, 00:17:20.381 "compare": false, 00:17:20.381 "compare_and_write": false, 00:17:20.381 "abort": true, 00:17:20.381 "seek_hole": false, 00:17:20.381 "seek_data": false, 00:17:20.381 "copy": true, 00:17:20.381 "nvme_iov_md": false 00:17:20.381 }, 00:17:20.381 "memory_domains": [ 00:17:20.381 { 00:17:20.381 "dma_device_id": "system", 00:17:20.381 "dma_device_type": 1 00:17:20.381 }, 00:17:20.381 { 00:17:20.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.381 "dma_device_type": 2 00:17:20.381 } 00:17:20.381 ], 00:17:20.381 "driver_specific": {} 00:17:20.381 } 00:17:20.381 ] 00:17:20.381 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:20.381 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:20.381 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:20.381 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:20.639 BaseBdev4 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:20.639 18:54:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:20.639 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.898 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:20.898 [ 00:17:20.898 { 00:17:20.898 "name": "BaseBdev4", 00:17:20.898 "aliases": [ 00:17:20.898 "2c3555ff-4526-472c-9550-68a8ae694273" 00:17:20.898 ], 00:17:20.898 "product_name": "Malloc disk", 00:17:20.898 "block_size": 512, 00:17:20.898 "num_blocks": 65536, 00:17:20.898 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:20.898 "assigned_rate_limits": { 00:17:20.898 "rw_ios_per_sec": 0, 00:17:20.898 "rw_mbytes_per_sec": 0, 00:17:20.898 "r_mbytes_per_sec": 0, 00:17:20.898 "w_mbytes_per_sec": 0 00:17:20.898 }, 00:17:20.898 "claimed": false, 00:17:20.898 "zoned": false, 00:17:20.898 "supported_io_types": { 00:17:20.898 "read": true, 00:17:20.898 "write": true, 00:17:20.898 "unmap": true, 00:17:20.898 "flush": true, 00:17:20.898 "reset": true, 00:17:20.898 "nvme_admin": false, 00:17:20.898 "nvme_io": false, 00:17:20.898 "nvme_io_md": false, 00:17:20.898 "write_zeroes": true, 00:17:20.898 "zcopy": true, 00:17:20.898 "get_zone_info": false, 00:17:20.898 "zone_management": false, 00:17:20.898 "zone_append": false, 00:17:20.898 "compare": false, 00:17:20.898 "compare_and_write": false, 00:17:20.898 "abort": true, 00:17:20.898 "seek_hole": false, 00:17:20.898 "seek_data": false, 00:17:20.898 "copy": true, 00:17:20.898 "nvme_iov_md": false 00:17:20.898 }, 00:17:20.898 "memory_domains": [ 00:17:20.898 { 00:17:20.898 "dma_device_id": "system", 00:17:20.898 "dma_device_type": 1 00:17:20.898 }, 00:17:20.898 { 00:17:20.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.898 "dma_device_type": 2 00:17:20.898 } 00:17:20.898 ], 00:17:20.898 "driver_specific": {} 00:17:20.898 } 00:17:20.898 ] 00:17:20.898 18:54:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:20.898 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:20.898 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:20.898 18:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:21.160 [2024-07-24 18:54:06.038141] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:21.160 [2024-07-24 18:54:06.038170] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:21.160 [2024-07-24 18:54:06.038186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:17:21.160 [2024-07-24 18:54:06.039108] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:21.160 [2024-07-24 18:54:06.039135] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.160 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.487 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.487 "name": "Existed_Raid", 00:17:21.487 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:21.487 "strip_size_kb": 0, 00:17:21.487 "state": "configuring", 00:17:21.487 "raid_level": "raid1", 00:17:21.487 "superblock": true, 00:17:21.487 "num_base_bdevs": 4, 00:17:21.487 "num_base_bdevs_discovered": 3, 00:17:21.487 "num_base_bdevs_operational": 4, 00:17:21.487 "base_bdevs_list": [ 00:17:21.487 { 00:17:21.487 "name": "BaseBdev1", 00:17:21.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.487 "is_configured": false, 00:17:21.487 "data_offset": 0, 00:17:21.487 "data_size": 0 00:17:21.487 }, 00:17:21.487 { 00:17:21.487 "name": "BaseBdev2", 00:17:21.487 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:21.487 "is_configured": true, 00:17:21.487 "data_offset": 2048, 00:17:21.487 "data_size": 63488 00:17:21.487 }, 00:17:21.487 { 00:17:21.487 "name": "BaseBdev3", 00:17:21.487 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:21.487 "is_configured": true, 00:17:21.487 "data_offset": 2048, 00:17:21.487 "data_size": 63488 00:17:21.487 }, 00:17:21.487 { 00:17:21.487 "name": "BaseBdev4", 00:17:21.487 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:21.487 "is_configured": true, 00:17:21.487 "data_offset": 2048, 00:17:21.487 "data_size": 63488 00:17:21.487 } 00:17:21.487 ] 00:17:21.487 }' 00:17:21.487 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.487 18:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.747 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:22.007 [2024-07-24 18:54:06.840208] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.007 18:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.266 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.266 "name": "Existed_Raid", 00:17:22.266 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:22.266 "strip_size_kb": 0, 00:17:22.266 "state": "configuring", 00:17:22.266 "raid_level": "raid1", 00:17:22.266 "superblock": true, 00:17:22.266 "num_base_bdevs": 4, 00:17:22.266 "num_base_bdevs_discovered": 2, 00:17:22.266 "num_base_bdevs_operational": 4, 00:17:22.266 "base_bdevs_list": [ 00:17:22.266 { 00:17:22.266 "name": "BaseBdev1", 00:17:22.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.266 "is_configured": false, 00:17:22.266 "data_offset": 0, 00:17:22.266 "data_size": 0 00:17:22.266 }, 00:17:22.266 { 00:17:22.266 "name": null, 00:17:22.266 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:22.266 "is_configured": false, 00:17:22.266 "data_offset": 2048, 00:17:22.266 "data_size": 63488 00:17:22.266 }, 00:17:22.266 { 00:17:22.266 "name": "BaseBdev3", 00:17:22.266 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:22.266 "is_configured": true, 00:17:22.266 "data_offset": 2048, 00:17:22.266 "data_size": 63488 00:17:22.266 }, 00:17:22.266 { 00:17:22.266 "name": "BaseBdev4", 00:17:22.266 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:22.266 "is_configured": true, 00:17:22.266 "data_offset": 2048, 00:17:22.266 "data_size": 63488 00:17:22.266 } 00:17:22.266 ] 00:17:22.266 }' 00:17:22.266 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.266 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.528 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.528 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:22.787 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:22.787 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:23.045 [2024-07-24 18:54:07.861508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:23.045 BaseBdev1 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:23.045 18:54:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.045 18:54:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:23.303 [ 00:17:23.303 { 00:17:23.303 "name": "BaseBdev1", 00:17:23.303 "aliases": [ 00:17:23.303 "e9257d58-b4f4-4ef8-8df5-dee4a2257dad" 00:17:23.303 ], 00:17:23.303 "product_name": "Malloc disk", 00:17:23.303 "block_size": 512, 00:17:23.303 "num_blocks": 65536, 00:17:23.303 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:23.303 "assigned_rate_limits": { 00:17:23.303 "rw_ios_per_sec": 0, 00:17:23.303 "rw_mbytes_per_sec": 0, 00:17:23.303 "r_mbytes_per_sec": 0, 00:17:23.303 "w_mbytes_per_sec": 0 00:17:23.303 }, 00:17:23.303 "claimed": true, 00:17:23.303 "claim_type": "exclusive_write", 00:17:23.303 "zoned": false, 00:17:23.303 "supported_io_types": { 00:17:23.303 "read": true, 00:17:23.303 "write": true, 00:17:23.303 "unmap": true, 00:17:23.303 "flush": true, 00:17:23.303 "reset": true, 00:17:23.303 "nvme_admin": false, 00:17:23.303 "nvme_io": false, 00:17:23.303 "nvme_io_md": false, 00:17:23.303 "write_zeroes": true, 00:17:23.303 "zcopy": true, 00:17:23.303 "get_zone_info": false, 00:17:23.303 "zone_management": false, 00:17:23.303 "zone_append": false, 00:17:23.303 "compare": false, 00:17:23.303 "compare_and_write": false, 00:17:23.303 "abort": true, 00:17:23.303 "seek_hole": false, 00:17:23.303 "seek_data": false, 00:17:23.303 "copy": true, 00:17:23.303 "nvme_iov_md": false 00:17:23.303 }, 00:17:23.303 "memory_domains": [ 00:17:23.303 { 00:17:23.303 "dma_device_id": "system", 00:17:23.303 "dma_device_type": 1 00:17:23.303 }, 00:17:23.303 { 00:17:23.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.303 "dma_device_type": 2 00:17:23.303 } 00:17:23.303 ], 00:17:23.303 "driver_specific": {} 00:17:23.303 } 00:17:23.303 ] 00:17:23.303 18:54:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:23.303 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:23.303 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.304 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.562 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.562 "name": "Existed_Raid", 00:17:23.562 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:23.562 "strip_size_kb": 0, 00:17:23.562 "state": "configuring", 00:17:23.562 "raid_level": "raid1", 00:17:23.562 "superblock": true, 00:17:23.562 "num_base_bdevs": 4, 00:17:23.562 "num_base_bdevs_discovered": 3, 00:17:23.562 "num_base_bdevs_operational": 4, 00:17:23.562 "base_bdevs_list": [ 00:17:23.562 { 00:17:23.562 "name": "BaseBdev1", 00:17:23.562 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:23.562 "is_configured": true, 00:17:23.562 "data_offset": 2048, 00:17:23.562 "data_size": 63488 00:17:23.562 }, 00:17:23.562 { 00:17:23.562 "name": null, 00:17:23.562 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:23.562 "is_configured": false, 00:17:23.562 "data_offset": 2048, 00:17:23.562 "data_size": 63488 00:17:23.562 }, 00:17:23.562 { 00:17:23.562 "name": "BaseBdev3", 00:17:23.562 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:23.562 "is_configured": true, 00:17:23.562 "data_offset": 2048, 00:17:23.562 "data_size": 63488 00:17:23.562 }, 00:17:23.562 { 00:17:23.562 "name": "BaseBdev4", 00:17:23.562 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:23.562 "is_configured": true, 00:17:23.562 "data_offset": 2048, 00:17:23.562 "data_size": 63488 00:17:23.562 } 00:17:23.562 ] 00:17:23.562 }' 00:17:23.562 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.562 18:54:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.131 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.131 18:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:17:24.131 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:24.131 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:24.389 [2024-07-24 18:54:09.196990] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:24.389 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:24.389 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.389 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.389 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.389 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.390 "name": "Existed_Raid", 00:17:24.390 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:24.390 "strip_size_kb": 0, 00:17:24.390 "state": "configuring", 00:17:24.390 "raid_level": "raid1", 00:17:24.390 "superblock": true, 00:17:24.390 "num_base_bdevs": 4, 00:17:24.390 "num_base_bdevs_discovered": 2, 00:17:24.390 "num_base_bdevs_operational": 4, 00:17:24.390 "base_bdevs_list": [ 00:17:24.390 { 00:17:24.390 "name": "BaseBdev1", 00:17:24.390 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:24.390 "is_configured": true, 00:17:24.390 "data_offset": 2048, 00:17:24.390 "data_size": 63488 00:17:24.390 }, 00:17:24.390 { 00:17:24.390 "name": null, 00:17:24.390 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:24.390 "is_configured": false, 00:17:24.390 "data_offset": 2048, 00:17:24.390 "data_size": 63488 00:17:24.390 }, 00:17:24.390 { 00:17:24.390 "name": null, 00:17:24.390 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:24.390 "is_configured": false, 00:17:24.390 "data_offset": 2048, 00:17:24.390 "data_size": 63488 00:17:24.390 }, 00:17:24.390 { 00:17:24.390 "name": "BaseBdev4", 00:17:24.390 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:24.390 "is_configured": true, 00:17:24.390 "data_offset": 2048, 00:17:24.390 "data_size": 63488 00:17:24.390 } 00:17:24.390 ] 00:17:24.390 }' 00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
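A condensed recap of what the trace around this point exercises: SPDK's raid "configuring" state machine, driven by detaching and re-attaching base bdevs over the RPC socket. The sketch below is illustrative only and uses just the rpc.py calls that appear in this log (socket path as in the trace; the rpc.py path is abbreviated):
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expected: false
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expected: true
Throughout this cycle the raid bdev is expected to keep reporting state "configuring", since fewer than the 4 operational base bdevs are discovered at any point.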
00:17:24.390 18:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.956 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.956 18:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:25.214 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:25.214 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:25.214 [2024-07-24 18:54:10.211628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.473 "name": "Existed_Raid", 00:17:25.473 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:25.473 "strip_size_kb": 0, 00:17:25.473 "state": "configuring", 00:17:25.473 "raid_level": "raid1", 00:17:25.473 "superblock": true, 00:17:25.473 "num_base_bdevs": 4, 00:17:25.473 "num_base_bdevs_discovered": 3, 00:17:25.473 "num_base_bdevs_operational": 4, 00:17:25.473 "base_bdevs_list": [ 00:17:25.473 { 00:17:25.473 "name": "BaseBdev1", 00:17:25.473 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:25.473 "is_configured": true, 00:17:25.473 "data_offset": 2048, 00:17:25.473 "data_size": 63488 00:17:25.473 }, 00:17:25.473 { 00:17:25.473 "name": null, 00:17:25.473 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:25.473 "is_configured": false, 00:17:25.473 "data_offset": 2048, 00:17:25.473 "data_size": 63488 00:17:25.473 }, 00:17:25.473 { 00:17:25.473 "name": "BaseBdev3", 00:17:25.473 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:25.473 "is_configured": true, 00:17:25.473 
"data_offset": 2048, 00:17:25.473 "data_size": 63488 00:17:25.473 }, 00:17:25.473 { 00:17:25.473 "name": "BaseBdev4", 00:17:25.473 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:25.473 "is_configured": true, 00:17:25.473 "data_offset": 2048, 00:17:25.473 "data_size": 63488 00:17:25.473 } 00:17:25.473 ] 00:17:25.473 }' 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.473 18:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.039 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.039 18:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:26.039 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:26.039 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:26.299 [2024-07-24 18:54:11.170122] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.299 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.557 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.557 "name": "Existed_Raid", 00:17:26.557 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:26.557 "strip_size_kb": 0, 00:17:26.557 "state": "configuring", 00:17:26.557 "raid_level": "raid1", 00:17:26.557 "superblock": true, 00:17:26.557 "num_base_bdevs": 4, 00:17:26.557 "num_base_bdevs_discovered": 2, 00:17:26.557 "num_base_bdevs_operational": 4, 00:17:26.557 "base_bdevs_list": [ 00:17:26.557 { 00:17:26.557 "name": null, 00:17:26.557 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:26.557 "is_configured": false, 00:17:26.557 "data_offset": 2048, 00:17:26.557 "data_size": 63488 00:17:26.557 }, 
00:17:26.557 { 00:17:26.557 "name": null, 00:17:26.557 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:26.557 "is_configured": false, 00:17:26.557 "data_offset": 2048, 00:17:26.557 "data_size": 63488 00:17:26.557 }, 00:17:26.557 { 00:17:26.557 "name": "BaseBdev3", 00:17:26.557 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:26.557 "is_configured": true, 00:17:26.557 "data_offset": 2048, 00:17:26.557 "data_size": 63488 00:17:26.557 }, 00:17:26.557 { 00:17:26.557 "name": "BaseBdev4", 00:17:26.557 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:26.557 "is_configured": true, 00:17:26.557 "data_offset": 2048, 00:17:26.557 "data_size": 63488 00:17:26.557 } 00:17:26.557 ] 00:17:26.557 }' 00:17:26.557 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.557 18:54:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.815 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:26.815 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.073 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:27.073 18:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:27.332 [2024-07-24 18:54:12.126521] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.332 "name": "Existed_Raid", 00:17:27.332 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:27.332 "strip_size_kb": 0, 00:17:27.332 "state": "configuring", 00:17:27.332 "raid_level": 
"raid1", 00:17:27.332 "superblock": true, 00:17:27.332 "num_base_bdevs": 4, 00:17:27.332 "num_base_bdevs_discovered": 3, 00:17:27.332 "num_base_bdevs_operational": 4, 00:17:27.332 "base_bdevs_list": [ 00:17:27.332 { 00:17:27.332 "name": null, 00:17:27.332 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:27.332 "is_configured": false, 00:17:27.332 "data_offset": 2048, 00:17:27.332 "data_size": 63488 00:17:27.332 }, 00:17:27.332 { 00:17:27.332 "name": "BaseBdev2", 00:17:27.332 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:27.332 "is_configured": true, 00:17:27.332 "data_offset": 2048, 00:17:27.332 "data_size": 63488 00:17:27.332 }, 00:17:27.332 { 00:17:27.332 "name": "BaseBdev3", 00:17:27.332 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:27.332 "is_configured": true, 00:17:27.332 "data_offset": 2048, 00:17:27.332 "data_size": 63488 00:17:27.332 }, 00:17:27.332 { 00:17:27.332 "name": "BaseBdev4", 00:17:27.332 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:27.332 "is_configured": true, 00:17:27.332 "data_offset": 2048, 00:17:27.332 "data_size": 63488 00:17:27.332 } 00:17:27.332 ] 00:17:27.332 }' 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.332 18:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.900 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.900 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:27.900 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:28.158 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.159 18:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:28.159 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e9257d58-b4f4-4ef8-8df5-dee4a2257dad 00:17:28.417 [2024-07-24 18:54:13.236087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:28.417 [2024-07-24 18:54:13.236214] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x276b730 00:17:28.417 [2024-07-24 18:54:13.236222] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:28.417 [2024-07-24 18:54:13.236352] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27562d0 00:17:28.417 [2024-07-24 18:54:13.236434] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x276b730 00:17:28.417 [2024-07-24 18:54:13.236439] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x276b730 00:17:28.417 [2024-07-24 18:54:13.236511] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:28.417 NewBaseBdev 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:28.417 
18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.417 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:28.675 [ 00:17:28.675 { 00:17:28.675 "name": "NewBaseBdev", 00:17:28.675 "aliases": [ 00:17:28.675 "e9257d58-b4f4-4ef8-8df5-dee4a2257dad" 00:17:28.675 ], 00:17:28.675 "product_name": "Malloc disk", 00:17:28.675 "block_size": 512, 00:17:28.675 "num_blocks": 65536, 00:17:28.675 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:28.675 "assigned_rate_limits": { 00:17:28.675 "rw_ios_per_sec": 0, 00:17:28.675 "rw_mbytes_per_sec": 0, 00:17:28.675 "r_mbytes_per_sec": 0, 00:17:28.675 "w_mbytes_per_sec": 0 00:17:28.675 }, 00:17:28.675 "claimed": true, 00:17:28.675 "claim_type": "exclusive_write", 00:17:28.675 "zoned": false, 00:17:28.675 "supported_io_types": { 00:17:28.675 "read": true, 00:17:28.675 "write": true, 00:17:28.675 "unmap": true, 00:17:28.675 "flush": true, 00:17:28.675 "reset": true, 00:17:28.675 "nvme_admin": false, 00:17:28.675 "nvme_io": false, 00:17:28.675 "nvme_io_md": false, 00:17:28.675 "write_zeroes": true, 00:17:28.675 "zcopy": true, 00:17:28.675 "get_zone_info": false, 00:17:28.675 "zone_management": false, 00:17:28.675 "zone_append": false, 00:17:28.675 "compare": false, 00:17:28.675 "compare_and_write": false, 00:17:28.675 "abort": true, 00:17:28.675 "seek_hole": false, 00:17:28.675 "seek_data": false, 00:17:28.675 "copy": true, 00:17:28.675 "nvme_iov_md": false 00:17:28.675 }, 00:17:28.675 "memory_domains": [ 00:17:28.675 { 00:17:28.675 "dma_device_id": "system", 00:17:28.676 "dma_device_type": 1 00:17:28.676 }, 00:17:28.676 { 00:17:28.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.676 "dma_device_type": 2 00:17:28.676 } 00:17:28.676 ], 00:17:28.676 "driver_specific": {} 00:17:28.676 } 00:17:28.676 ] 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.676 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.934 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.934 "name": "Existed_Raid", 00:17:28.934 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:28.934 "strip_size_kb": 0, 00:17:28.934 "state": "online", 00:17:28.934 "raid_level": "raid1", 00:17:28.934 "superblock": true, 00:17:28.934 "num_base_bdevs": 4, 00:17:28.934 "num_base_bdevs_discovered": 4, 00:17:28.934 "num_base_bdevs_operational": 4, 00:17:28.934 "base_bdevs_list": [ 00:17:28.934 { 00:17:28.934 "name": "NewBaseBdev", 00:17:28.934 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:28.934 "is_configured": true, 00:17:28.934 "data_offset": 2048, 00:17:28.934 "data_size": 63488 00:17:28.934 }, 00:17:28.934 { 00:17:28.934 "name": "BaseBdev2", 00:17:28.934 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:28.934 "is_configured": true, 00:17:28.934 "data_offset": 2048, 00:17:28.934 "data_size": 63488 00:17:28.934 }, 00:17:28.934 { 00:17:28.934 "name": "BaseBdev3", 00:17:28.934 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:28.934 "is_configured": true, 00:17:28.934 "data_offset": 2048, 00:17:28.934 "data_size": 63488 00:17:28.934 }, 00:17:28.934 { 00:17:28.934 "name": "BaseBdev4", 00:17:28.934 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:28.934 "is_configured": true, 00:17:28.934 "data_offset": 2048, 00:17:28.934 "data_size": 63488 00:17:28.934 } 00:17:28.934 ] 00:17:28.934 }' 00:17:28.934 18:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.934 18:54:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:29.502 [2024-07-24 18:54:14.359197] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:29.502 "name": "Existed_Raid", 00:17:29.502 "aliases": [ 
00:17:29.502 "6645c20d-b819-4e66-a797-c91371014fd4" 00:17:29.502 ], 00:17:29.502 "product_name": "Raid Volume", 00:17:29.502 "block_size": 512, 00:17:29.502 "num_blocks": 63488, 00:17:29.502 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:29.502 "assigned_rate_limits": { 00:17:29.502 "rw_ios_per_sec": 0, 00:17:29.502 "rw_mbytes_per_sec": 0, 00:17:29.502 "r_mbytes_per_sec": 0, 00:17:29.502 "w_mbytes_per_sec": 0 00:17:29.502 }, 00:17:29.502 "claimed": false, 00:17:29.502 "zoned": false, 00:17:29.502 "supported_io_types": { 00:17:29.502 "read": true, 00:17:29.502 "write": true, 00:17:29.502 "unmap": false, 00:17:29.502 "flush": false, 00:17:29.502 "reset": true, 00:17:29.502 "nvme_admin": false, 00:17:29.502 "nvme_io": false, 00:17:29.502 "nvme_io_md": false, 00:17:29.502 "write_zeroes": true, 00:17:29.502 "zcopy": false, 00:17:29.502 "get_zone_info": false, 00:17:29.502 "zone_management": false, 00:17:29.502 "zone_append": false, 00:17:29.502 "compare": false, 00:17:29.502 "compare_and_write": false, 00:17:29.502 "abort": false, 00:17:29.502 "seek_hole": false, 00:17:29.502 "seek_data": false, 00:17:29.502 "copy": false, 00:17:29.502 "nvme_iov_md": false 00:17:29.502 }, 00:17:29.502 "memory_domains": [ 00:17:29.502 { 00:17:29.502 "dma_device_id": "system", 00:17:29.502 "dma_device_type": 1 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.502 "dma_device_type": 2 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "system", 00:17:29.502 "dma_device_type": 1 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.502 "dma_device_type": 2 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "system", 00:17:29.502 "dma_device_type": 1 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.502 "dma_device_type": 2 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "system", 00:17:29.502 "dma_device_type": 1 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.502 "dma_device_type": 2 00:17:29.502 } 00:17:29.502 ], 00:17:29.502 "driver_specific": { 00:17:29.502 "raid": { 00:17:29.502 "uuid": "6645c20d-b819-4e66-a797-c91371014fd4", 00:17:29.502 "strip_size_kb": 0, 00:17:29.502 "state": "online", 00:17:29.502 "raid_level": "raid1", 00:17:29.502 "superblock": true, 00:17:29.502 "num_base_bdevs": 4, 00:17:29.502 "num_base_bdevs_discovered": 4, 00:17:29.502 "num_base_bdevs_operational": 4, 00:17:29.502 "base_bdevs_list": [ 00:17:29.502 { 00:17:29.502 "name": "NewBaseBdev", 00:17:29.502 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:29.502 "is_configured": true, 00:17:29.502 "data_offset": 2048, 00:17:29.502 "data_size": 63488 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "name": "BaseBdev2", 00:17:29.502 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:29.502 "is_configured": true, 00:17:29.502 "data_offset": 2048, 00:17:29.502 "data_size": 63488 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "name": "BaseBdev3", 00:17:29.502 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:29.502 "is_configured": true, 00:17:29.502 "data_offset": 2048, 00:17:29.502 "data_size": 63488 00:17:29.502 }, 00:17:29.502 { 00:17:29.502 "name": "BaseBdev4", 00:17:29.502 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:29.502 "is_configured": true, 00:17:29.502 "data_offset": 2048, 00:17:29.502 "data_size": 63488 00:17:29.502 } 00:17:29.502 ] 00:17:29.502 } 00:17:29.502 } 00:17:29.502 }' 00:17:29.502 18:54:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:29.502 BaseBdev2 00:17:29.502 BaseBdev3 00:17:29.502 BaseBdev4' 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:29.502 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.761 "name": "NewBaseBdev", 00:17:29.761 "aliases": [ 00:17:29.761 "e9257d58-b4f4-4ef8-8df5-dee4a2257dad" 00:17:29.761 ], 00:17:29.761 "product_name": "Malloc disk", 00:17:29.761 "block_size": 512, 00:17:29.761 "num_blocks": 65536, 00:17:29.761 "uuid": "e9257d58-b4f4-4ef8-8df5-dee4a2257dad", 00:17:29.761 "assigned_rate_limits": { 00:17:29.761 "rw_ios_per_sec": 0, 00:17:29.761 "rw_mbytes_per_sec": 0, 00:17:29.761 "r_mbytes_per_sec": 0, 00:17:29.761 "w_mbytes_per_sec": 0 00:17:29.761 }, 00:17:29.761 "claimed": true, 00:17:29.761 "claim_type": "exclusive_write", 00:17:29.761 "zoned": false, 00:17:29.761 "supported_io_types": { 00:17:29.761 "read": true, 00:17:29.761 "write": true, 00:17:29.761 "unmap": true, 00:17:29.761 "flush": true, 00:17:29.761 "reset": true, 00:17:29.761 "nvme_admin": false, 00:17:29.761 "nvme_io": false, 00:17:29.761 "nvme_io_md": false, 00:17:29.761 "write_zeroes": true, 00:17:29.761 "zcopy": true, 00:17:29.761 "get_zone_info": false, 00:17:29.761 "zone_management": false, 00:17:29.761 "zone_append": false, 00:17:29.761 "compare": false, 00:17:29.761 "compare_and_write": false, 00:17:29.761 "abort": true, 00:17:29.761 "seek_hole": false, 00:17:29.761 "seek_data": false, 00:17:29.761 "copy": true, 00:17:29.761 "nvme_iov_md": false 00:17:29.761 }, 00:17:29.761 "memory_domains": [ 00:17:29.761 { 00:17:29.761 "dma_device_id": "system", 00:17:29.761 "dma_device_type": 1 00:17:29.761 }, 00:17:29.761 { 00:17:29.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.761 "dma_device_type": 2 00:17:29.761 } 00:17:29.761 ], 00:17:29.761 "driver_specific": {} 00:17:29.761 }' 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:29.761 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:30.020 18:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.278 "name": "BaseBdev2", 00:17:30.278 "aliases": [ 00:17:30.278 "d216cc46-89ff-49d5-90f5-037e331e8645" 00:17:30.278 ], 00:17:30.278 "product_name": "Malloc disk", 00:17:30.278 "block_size": 512, 00:17:30.278 "num_blocks": 65536, 00:17:30.278 "uuid": "d216cc46-89ff-49d5-90f5-037e331e8645", 00:17:30.278 "assigned_rate_limits": { 00:17:30.278 "rw_ios_per_sec": 0, 00:17:30.278 "rw_mbytes_per_sec": 0, 00:17:30.278 "r_mbytes_per_sec": 0, 00:17:30.278 "w_mbytes_per_sec": 0 00:17:30.278 }, 00:17:30.278 "claimed": true, 00:17:30.278 "claim_type": "exclusive_write", 00:17:30.278 "zoned": false, 00:17:30.278 "supported_io_types": { 00:17:30.278 "read": true, 00:17:30.278 "write": true, 00:17:30.278 "unmap": true, 00:17:30.278 "flush": true, 00:17:30.278 "reset": true, 00:17:30.278 "nvme_admin": false, 00:17:30.278 "nvme_io": false, 00:17:30.278 "nvme_io_md": false, 00:17:30.278 "write_zeroes": true, 00:17:30.278 "zcopy": true, 00:17:30.278 "get_zone_info": false, 00:17:30.278 "zone_management": false, 00:17:30.278 "zone_append": false, 00:17:30.278 "compare": false, 00:17:30.278 "compare_and_write": false, 00:17:30.278 "abort": true, 00:17:30.278 "seek_hole": false, 00:17:30.278 "seek_data": false, 00:17:30.278 "copy": true, 00:17:30.278 "nvme_iov_md": false 00:17:30.278 }, 00:17:30.278 "memory_domains": [ 00:17:30.278 { 00:17:30.278 "dma_device_id": "system", 00:17:30.278 "dma_device_type": 1 00:17:30.278 }, 00:17:30.278 { 00:17:30.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.278 "dma_device_type": 2 00:17:30.278 } 00:17:30.278 ], 00:17:30.278 "driver_specific": {} 00:17:30.278 }' 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.278 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.536 
18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.536 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.536 "name": "BaseBdev3", 00:17:30.536 "aliases": [ 00:17:30.536 "372f735a-fba8-4346-9a37-ab89e3dc2a48" 00:17:30.536 ], 00:17:30.536 "product_name": "Malloc disk", 00:17:30.537 "block_size": 512, 00:17:30.537 "num_blocks": 65536, 00:17:30.537 "uuid": "372f735a-fba8-4346-9a37-ab89e3dc2a48", 00:17:30.537 "assigned_rate_limits": { 00:17:30.537 "rw_ios_per_sec": 0, 00:17:30.537 "rw_mbytes_per_sec": 0, 00:17:30.537 "r_mbytes_per_sec": 0, 00:17:30.537 "w_mbytes_per_sec": 0 00:17:30.537 }, 00:17:30.537 "claimed": true, 00:17:30.537 "claim_type": "exclusive_write", 00:17:30.537 "zoned": false, 00:17:30.537 "supported_io_types": { 00:17:30.537 "read": true, 00:17:30.537 "write": true, 00:17:30.537 "unmap": true, 00:17:30.537 "flush": true, 00:17:30.537 "reset": true, 00:17:30.537 "nvme_admin": false, 00:17:30.537 "nvme_io": false, 00:17:30.537 "nvme_io_md": false, 00:17:30.537 "write_zeroes": true, 00:17:30.537 "zcopy": true, 00:17:30.537 "get_zone_info": false, 00:17:30.537 "zone_management": false, 00:17:30.537 "zone_append": false, 00:17:30.537 "compare": false, 00:17:30.537 "compare_and_write": false, 00:17:30.537 "abort": true, 00:17:30.537 "seek_hole": false, 00:17:30.537 "seek_data": false, 00:17:30.537 "copy": true, 00:17:30.537 "nvme_iov_md": false 00:17:30.537 }, 00:17:30.537 "memory_domains": [ 00:17:30.537 { 00:17:30.537 "dma_device_id": "system", 00:17:30.537 "dma_device_type": 1 00:17:30.537 }, 00:17:30.537 { 00:17:30.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.537 "dma_device_type": 2 00:17:30.537 } 00:17:30.537 ], 00:17:30.537 "driver_specific": {} 00:17:30.537 }' 00:17:30.537 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.795 18:54:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:30.795 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.054 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.054 "name": "BaseBdev4", 00:17:31.054 "aliases": [ 00:17:31.054 "2c3555ff-4526-472c-9550-68a8ae694273" 00:17:31.054 ], 00:17:31.054 "product_name": "Malloc disk", 00:17:31.054 "block_size": 512, 00:17:31.054 "num_blocks": 65536, 00:17:31.054 "uuid": "2c3555ff-4526-472c-9550-68a8ae694273", 00:17:31.054 "assigned_rate_limits": { 00:17:31.054 "rw_ios_per_sec": 0, 00:17:31.054 "rw_mbytes_per_sec": 0, 00:17:31.054 "r_mbytes_per_sec": 0, 00:17:31.054 "w_mbytes_per_sec": 0 00:17:31.054 }, 00:17:31.054 "claimed": true, 00:17:31.054 "claim_type": "exclusive_write", 00:17:31.054 "zoned": false, 00:17:31.054 "supported_io_types": { 00:17:31.054 "read": true, 00:17:31.054 "write": true, 00:17:31.054 "unmap": true, 00:17:31.054 "flush": true, 00:17:31.054 "reset": true, 00:17:31.054 "nvme_admin": false, 00:17:31.054 "nvme_io": false, 00:17:31.054 "nvme_io_md": false, 00:17:31.054 "write_zeroes": true, 00:17:31.054 "zcopy": true, 00:17:31.054 "get_zone_info": false, 00:17:31.054 "zone_management": false, 00:17:31.054 "zone_append": false, 00:17:31.054 "compare": false, 00:17:31.054 "compare_and_write": false, 00:17:31.054 "abort": true, 00:17:31.054 "seek_hole": false, 00:17:31.054 "seek_data": false, 00:17:31.054 "copy": true, 00:17:31.054 "nvme_iov_md": false 00:17:31.054 }, 00:17:31.054 "memory_domains": [ 00:17:31.054 { 00:17:31.054 "dma_device_id": "system", 00:17:31.054 "dma_device_type": 1 00:17:31.054 }, 00:17:31.054 { 00:17:31.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.054 "dma_device_type": 2 00:17:31.054 } 00:17:31.054 ], 00:17:31.054 "driver_specific": {} 00:17:31.054 }' 00:17:31.054 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.054 18:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.054 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.054 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.313 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:31.572 [2024-07-24 18:54:16.420356] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:31.572 [2024-07-24 18:54:16.420375] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:31.572 [2024-07-24 18:54:16.420411] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:31.572 [2024-07-24 18:54:16.420597] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:31.572 [2024-07-24 18:54:16.420604] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x276b730 name Existed_Raid, state offline 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2138636 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2138636 ']' 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2138636 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2138636 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2138636' 00:17:31.572 killing process with pid 2138636 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2138636 00:17:31.572 [2024-07-24 18:54:16.484949] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:31.572 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2138636 00:17:31.572 [2024-07-24 18:54:16.515705] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:31.831 18:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:31.831 00:17:31.831 real 0m24.121s 00:17:31.831 user 0m44.874s 00:17:31.831 sys 0m3.765s 00:17:31.831 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:31.831 18:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:31.831 ************************************ 00:17:31.831 END TEST raid_state_function_test_sb 00:17:31.831 ************************************ 00:17:31.831 18:54:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:17:31.831 18:54:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:31.831 18:54:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:31.831 18:54:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:31.831 ************************************ 00:17:31.831 START TEST raid_superblock_test 00:17:31.831 ************************************ 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2143830 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2143830 /var/tmp/spdk-raid.sock 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2143830 ']' 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:31.831 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:31.831 18:54:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.832 [2024-07-24 18:54:16.812536] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
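For context on the raid_superblock_test that starts here: judging from the RPC calls that follow in this trace, the test builds raid_bdev1 from passthru bdevs layered on malloc bdevs and then creates the raid1 volume with an on-disk superblock (the -s flag). A condensed, illustrative sketch of that sequence (rpc.py path abbreviated):
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # ... repeated for malloc2/pt2, malloc3/pt3, malloc4/pt4 ...
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s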
00:17:31.832 [2024-07-24 18:54:16.812582] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2143830 ] 00:17:32.090 [2024-07-24 18:54:16.882337] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:32.090 [2024-07-24 18:54:16.960787] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:32.090 [2024-07-24 18:54:17.012482] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.090 [2024-07-24 18:54:17.012509] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:32.657 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:32.916 malloc1 00:17:32.916 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:32.916 [2024-07-24 18:54:17.924280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:32.916 [2024-07-24 18:54:17.924315] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:32.916 [2024-07-24 18:54:17.924329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1324e20 00:17:32.916 [2024-07-24 18:54:17.924336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.174 [2024-07-24 18:54:17.925561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.174 [2024-07-24 18:54:17.925584] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:33.174 pt1 00:17:33.174 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:33.175 18:54:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:33.175 malloc2 00:17:33.175 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:33.433 [2024-07-24 18:54:18.252759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:33.433 [2024-07-24 18:54:18.252791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.433 [2024-07-24 18:54:18.252801] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ceed0 00:17:33.433 [2024-07-24 18:54:18.252807] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.433 [2024-07-24 18:54:18.253824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.433 [2024-07-24 18:54:18.253844] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:33.433 pt2 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:33.433 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:33.434 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:33.434 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:33.434 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:33.434 malloc3 00:17:33.434 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:33.692 [2024-07-24 18:54:18.589214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:33.692 [2024-07-24 18:54:18.589247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.692 [2024-07-24 18:54:18.589258] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d2a30 00:17:33.692 [2024-07-24 18:54:18.589263] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.692 [2024-07-24 18:54:18.590277] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.692 [2024-07-24 18:54:18.590298] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:33.692 pt3 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:33.692 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:33.950 malloc4 00:17:33.950 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:33.950 [2024-07-24 18:54:18.909457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:33.950 [2024-07-24 18:54:18.909492] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.950 [2024-07-24 18:54:18.909502] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cf900 00:17:33.950 [2024-07-24 18:54:18.909509] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.950 [2024-07-24 18:54:18.910531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.950 [2024-07-24 18:54:18.910551] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:33.950 pt4 00:17:33.950 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:33.950 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:33.950 18:54:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:34.208 [2024-07-24 18:54:19.073890] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:34.208 [2024-07-24 18:54:19.074746] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:34.208 [2024-07-24 18:54:19.074785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:34.208 [2024-07-24 18:54:19.074812] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:34.208 [2024-07-24 18:54:19.074923] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d2d40 00:17:34.208 [2024-07-24 18:54:19.074929] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:34.208 [2024-07-24 18:54:19.075061] bdev_raid.c: 263:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x14d7140 00:17:34.208 [2024-07-24 18:54:19.075159] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d2d40 00:17:34.208 [2024-07-24 18:54:19.075164] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d2d40 00:17:34.208 [2024-07-24 18:54:19.075228] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.208 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:34.466 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.466 "name": "raid_bdev1", 00:17:34.466 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:34.466 "strip_size_kb": 0, 00:17:34.466 "state": "online", 00:17:34.466 "raid_level": "raid1", 00:17:34.466 "superblock": true, 00:17:34.466 "num_base_bdevs": 4, 00:17:34.466 "num_base_bdevs_discovered": 4, 00:17:34.466 "num_base_bdevs_operational": 4, 00:17:34.466 "base_bdevs_list": [ 00:17:34.466 { 00:17:34.466 "name": "pt1", 00:17:34.466 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:34.466 "is_configured": true, 00:17:34.466 "data_offset": 2048, 00:17:34.466 "data_size": 63488 00:17:34.466 }, 00:17:34.466 { 00:17:34.466 "name": "pt2", 00:17:34.466 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:34.466 "is_configured": true, 00:17:34.466 "data_offset": 2048, 00:17:34.466 "data_size": 63488 00:17:34.466 }, 00:17:34.466 { 00:17:34.466 "name": "pt3", 00:17:34.466 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:34.466 "is_configured": true, 00:17:34.466 "data_offset": 2048, 00:17:34.466 "data_size": 63488 00:17:34.466 }, 00:17:34.466 { 00:17:34.466 "name": "pt4", 00:17:34.466 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:34.466 "is_configured": true, 00:17:34.466 "data_offset": 2048, 00:17:34.466 "data_size": 63488 00:17:34.466 } 00:17:34.466 ] 00:17:34.466 }' 00:17:34.466 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.466 18:54:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:35.033 [2024-07-24 18:54:19.884159] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:35.033 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:35.033 "name": "raid_bdev1", 00:17:35.033 "aliases": [ 00:17:35.033 "34cf55d3-93dc-4fc0-824a-a3a127d964b7" 00:17:35.033 ], 00:17:35.033 "product_name": "Raid Volume", 00:17:35.033 "block_size": 512, 00:17:35.033 "num_blocks": 63488, 00:17:35.033 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:35.033 "assigned_rate_limits": { 00:17:35.033 "rw_ios_per_sec": 0, 00:17:35.033 "rw_mbytes_per_sec": 0, 00:17:35.033 "r_mbytes_per_sec": 0, 00:17:35.033 "w_mbytes_per_sec": 0 00:17:35.033 }, 00:17:35.033 "claimed": false, 00:17:35.033 "zoned": false, 00:17:35.033 "supported_io_types": { 00:17:35.033 "read": true, 00:17:35.033 "write": true, 00:17:35.033 "unmap": false, 00:17:35.033 "flush": false, 00:17:35.033 "reset": true, 00:17:35.033 "nvme_admin": false, 00:17:35.033 "nvme_io": false, 00:17:35.033 "nvme_io_md": false, 00:17:35.033 "write_zeroes": true, 00:17:35.033 "zcopy": false, 00:17:35.033 "get_zone_info": false, 00:17:35.033 "zone_management": false, 00:17:35.033 "zone_append": false, 00:17:35.033 "compare": false, 00:17:35.033 "compare_and_write": false, 00:17:35.033 "abort": false, 00:17:35.033 "seek_hole": false, 00:17:35.033 "seek_data": false, 00:17:35.033 "copy": false, 00:17:35.033 "nvme_iov_md": false 00:17:35.033 }, 00:17:35.033 "memory_domains": [ 00:17:35.033 { 00:17:35.033 "dma_device_id": "system", 00:17:35.033 "dma_device_type": 1 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.033 "dma_device_type": 2 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "system", 00:17:35.033 "dma_device_type": 1 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.033 "dma_device_type": 2 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "system", 00:17:35.033 "dma_device_type": 1 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.033 "dma_device_type": 2 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "system", 00:17:35.033 "dma_device_type": 1 00:17:35.033 }, 00:17:35.033 { 00:17:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.033 "dma_device_type": 2 00:17:35.033 } 00:17:35.033 ], 00:17:35.033 "driver_specific": { 00:17:35.033 "raid": { 00:17:35.033 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:35.033 "strip_size_kb": 0, 00:17:35.033 "state": "online", 00:17:35.033 "raid_level": "raid1", 00:17:35.033 "superblock": true, 00:17:35.033 
"num_base_bdevs": 4, 00:17:35.033 "num_base_bdevs_discovered": 4, 00:17:35.034 "num_base_bdevs_operational": 4, 00:17:35.034 "base_bdevs_list": [ 00:17:35.034 { 00:17:35.034 "name": "pt1", 00:17:35.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 }, 00:17:35.034 { 00:17:35.034 "name": "pt2", 00:17:35.034 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 }, 00:17:35.034 { 00:17:35.034 "name": "pt3", 00:17:35.034 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 }, 00:17:35.034 { 00:17:35.034 "name": "pt4", 00:17:35.034 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 } 00:17:35.034 ] 00:17:35.034 } 00:17:35.034 } 00:17:35.034 }' 00:17:35.034 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:35.034 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:35.034 pt2 00:17:35.034 pt3 00:17:35.034 pt4' 00:17:35.034 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.034 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:35.034 18:54:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:35.292 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:35.292 "name": "pt1", 00:17:35.292 "aliases": [ 00:17:35.292 "00000000-0000-0000-0000-000000000001" 00:17:35.292 ], 00:17:35.292 "product_name": "passthru", 00:17:35.292 "block_size": 512, 00:17:35.292 "num_blocks": 65536, 00:17:35.292 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.292 "assigned_rate_limits": { 00:17:35.293 "rw_ios_per_sec": 0, 00:17:35.293 "rw_mbytes_per_sec": 0, 00:17:35.293 "r_mbytes_per_sec": 0, 00:17:35.293 "w_mbytes_per_sec": 0 00:17:35.293 }, 00:17:35.293 "claimed": true, 00:17:35.293 "claim_type": "exclusive_write", 00:17:35.293 "zoned": false, 00:17:35.293 "supported_io_types": { 00:17:35.293 "read": true, 00:17:35.293 "write": true, 00:17:35.293 "unmap": true, 00:17:35.293 "flush": true, 00:17:35.293 "reset": true, 00:17:35.293 "nvme_admin": false, 00:17:35.293 "nvme_io": false, 00:17:35.293 "nvme_io_md": false, 00:17:35.293 "write_zeroes": true, 00:17:35.293 "zcopy": true, 00:17:35.293 "get_zone_info": false, 00:17:35.293 "zone_management": false, 00:17:35.293 "zone_append": false, 00:17:35.293 "compare": false, 00:17:35.293 "compare_and_write": false, 00:17:35.293 "abort": true, 00:17:35.293 "seek_hole": false, 00:17:35.293 "seek_data": false, 00:17:35.293 "copy": true, 00:17:35.293 "nvme_iov_md": false 00:17:35.293 }, 00:17:35.293 "memory_domains": [ 00:17:35.293 { 00:17:35.293 "dma_device_id": "system", 00:17:35.293 "dma_device_type": 1 00:17:35.293 }, 00:17:35.293 { 00:17:35.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.293 "dma_device_type": 2 00:17:35.293 } 00:17:35.293 ], 00:17:35.293 "driver_specific": { 00:17:35.293 "passthru": { 00:17:35.293 
"name": "pt1", 00:17:35.293 "base_bdev_name": "malloc1" 00:17:35.293 } 00:17:35.293 } 00:17:35.293 }' 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:35.293 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:35.552 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:35.811 "name": "pt2", 00:17:35.811 "aliases": [ 00:17:35.811 "00000000-0000-0000-0000-000000000002" 00:17:35.811 ], 00:17:35.811 "product_name": "passthru", 00:17:35.811 "block_size": 512, 00:17:35.811 "num_blocks": 65536, 00:17:35.811 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.811 "assigned_rate_limits": { 00:17:35.811 "rw_ios_per_sec": 0, 00:17:35.811 "rw_mbytes_per_sec": 0, 00:17:35.811 "r_mbytes_per_sec": 0, 00:17:35.811 "w_mbytes_per_sec": 0 00:17:35.811 }, 00:17:35.811 "claimed": true, 00:17:35.811 "claim_type": "exclusive_write", 00:17:35.811 "zoned": false, 00:17:35.811 "supported_io_types": { 00:17:35.811 "read": true, 00:17:35.811 "write": true, 00:17:35.811 "unmap": true, 00:17:35.811 "flush": true, 00:17:35.811 "reset": true, 00:17:35.811 "nvme_admin": false, 00:17:35.811 "nvme_io": false, 00:17:35.811 "nvme_io_md": false, 00:17:35.811 "write_zeroes": true, 00:17:35.811 "zcopy": true, 00:17:35.811 "get_zone_info": false, 00:17:35.811 "zone_management": false, 00:17:35.811 "zone_append": false, 00:17:35.811 "compare": false, 00:17:35.811 "compare_and_write": false, 00:17:35.811 "abort": true, 00:17:35.811 "seek_hole": false, 00:17:35.811 "seek_data": false, 00:17:35.811 "copy": true, 00:17:35.811 "nvme_iov_md": false 00:17:35.811 }, 00:17:35.811 "memory_domains": [ 00:17:35.811 { 00:17:35.811 "dma_device_id": "system", 00:17:35.811 "dma_device_type": 1 00:17:35.811 }, 00:17:35.811 { 00:17:35.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.811 "dma_device_type": 2 00:17:35.811 } 00:17:35.811 ], 00:17:35.811 "driver_specific": { 00:17:35.811 "passthru": { 00:17:35.811 "name": "pt2", 00:17:35.811 "base_bdev_name": "malloc2" 00:17:35.811 } 00:17:35.811 } 00:17:35.811 }' 00:17:35.811 18:54:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.811 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:36.069 18:54:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.326 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.326 "name": "pt3", 00:17:36.326 "aliases": [ 00:17:36.326 "00000000-0000-0000-0000-000000000003" 00:17:36.326 ], 00:17:36.326 "product_name": "passthru", 00:17:36.326 "block_size": 512, 00:17:36.326 "num_blocks": 65536, 00:17:36.326 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:36.326 "assigned_rate_limits": { 00:17:36.326 "rw_ios_per_sec": 0, 00:17:36.327 "rw_mbytes_per_sec": 0, 00:17:36.327 "r_mbytes_per_sec": 0, 00:17:36.327 "w_mbytes_per_sec": 0 00:17:36.327 }, 00:17:36.327 "claimed": true, 00:17:36.327 "claim_type": "exclusive_write", 00:17:36.327 "zoned": false, 00:17:36.327 "supported_io_types": { 00:17:36.327 "read": true, 00:17:36.327 "write": true, 00:17:36.327 "unmap": true, 00:17:36.327 "flush": true, 00:17:36.327 "reset": true, 00:17:36.327 "nvme_admin": false, 00:17:36.327 "nvme_io": false, 00:17:36.327 "nvme_io_md": false, 00:17:36.327 "write_zeroes": true, 00:17:36.327 "zcopy": true, 00:17:36.327 "get_zone_info": false, 00:17:36.327 "zone_management": false, 00:17:36.327 "zone_append": false, 00:17:36.327 "compare": false, 00:17:36.327 "compare_and_write": false, 00:17:36.327 "abort": true, 00:17:36.327 "seek_hole": false, 00:17:36.327 "seek_data": false, 00:17:36.327 "copy": true, 00:17:36.327 "nvme_iov_md": false 00:17:36.327 }, 00:17:36.327 "memory_domains": [ 00:17:36.327 { 00:17:36.327 "dma_device_id": "system", 00:17:36.327 "dma_device_type": 1 00:17:36.327 }, 00:17:36.327 { 00:17:36.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.327 "dma_device_type": 2 00:17:36.327 } 00:17:36.327 ], 00:17:36.327 "driver_specific": { 00:17:36.327 "passthru": { 00:17:36.327 "name": "pt3", 00:17:36.327 "base_bdev_name": "malloc3" 00:17:36.327 } 00:17:36.327 } 00:17:36.327 }' 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.327 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:36.585 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.843 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.843 "name": "pt4", 00:17:36.843 "aliases": [ 00:17:36.843 "00000000-0000-0000-0000-000000000004" 00:17:36.843 ], 00:17:36.843 "product_name": "passthru", 00:17:36.843 "block_size": 512, 00:17:36.843 "num_blocks": 65536, 00:17:36.843 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:36.843 "assigned_rate_limits": { 00:17:36.843 "rw_ios_per_sec": 0, 00:17:36.843 "rw_mbytes_per_sec": 0, 00:17:36.843 "r_mbytes_per_sec": 0, 00:17:36.843 "w_mbytes_per_sec": 0 00:17:36.843 }, 00:17:36.843 "claimed": true, 00:17:36.843 "claim_type": "exclusive_write", 00:17:36.843 "zoned": false, 00:17:36.843 "supported_io_types": { 00:17:36.843 "read": true, 00:17:36.843 "write": true, 00:17:36.843 "unmap": true, 00:17:36.843 "flush": true, 00:17:36.843 "reset": true, 00:17:36.843 "nvme_admin": false, 00:17:36.843 "nvme_io": false, 00:17:36.843 "nvme_io_md": false, 00:17:36.843 "write_zeroes": true, 00:17:36.843 "zcopy": true, 00:17:36.843 "get_zone_info": false, 00:17:36.843 "zone_management": false, 00:17:36.843 "zone_append": false, 00:17:36.843 "compare": false, 00:17:36.843 "compare_and_write": false, 00:17:36.843 "abort": true, 00:17:36.843 "seek_hole": false, 00:17:36.843 "seek_data": false, 00:17:36.843 "copy": true, 00:17:36.843 "nvme_iov_md": false 00:17:36.843 }, 00:17:36.843 "memory_domains": [ 00:17:36.843 { 00:17:36.843 "dma_device_id": "system", 00:17:36.843 "dma_device_type": 1 00:17:36.843 }, 00:17:36.843 { 00:17:36.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.843 "dma_device_type": 2 00:17:36.843 } 00:17:36.843 ], 00:17:36.843 "driver_specific": { 00:17:36.843 "passthru": { 00:17:36.843 "name": "pt4", 00:17:36.843 "base_bdev_name": "malloc4" 00:17:36.843 } 00:17:36.843 } 00:17:36.843 }' 00:17:36.843 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.843 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.843 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.844 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.128 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.128 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:37.128 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:37.128 18:54:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:37.128 [2024-07-24 18:54:22.049740] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:37.128 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=34cf55d3-93dc-4fc0-824a-a3a127d964b7 00:17:37.128 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 34cf55d3-93dc-4fc0-824a-a3a127d964b7 ']' 00:17:37.128 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:37.386 [2024-07-24 18:54:22.217972] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:37.386 [2024-07-24 18:54:22.217984] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:37.386 [2024-07-24 18:54:22.218023] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:37.386 [2024-07-24 18:54:22.218080] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:37.386 [2024-07-24 18:54:22.218087] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d2d40 name raid_bdev1, state offline 00:17:37.386 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.386 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:37.643 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:37.643 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:37.643 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:37.643 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:37.644 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:37.644 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:17:37.901 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:37.901 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:37.901 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:37.901 18:54:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:38.160 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:38.160 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:38.419 [2024-07-24 18:54:23.344859] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:38.419 [2024-07-24 18:54:23.346059] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:38.419 [2024-07-24 18:54:23.346093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:38.419 [2024-07-24 18:54:23.346114] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:38.419 [2024-07-24 18:54:23.346148] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:38.419 [2024-07-24 18:54:23.346179] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:38.419 [2024-07-24 18:54:23.346191] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:38.419 [2024-07-24 18:54:23.346203] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:38.419 [2024-07-24 18:54:23.346212] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:38.419 [2024-07-24 18:54:23.346217] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cf100 name raid_bdev1, state configuring 00:17:38.419 request: 00:17:38.419 { 00:17:38.419 "name": "raid_bdev1", 00:17:38.419 "raid_level": "raid1", 00:17:38.419 "base_bdevs": [ 00:17:38.419 "malloc1", 00:17:38.419 "malloc2", 00:17:38.419 "malloc3", 00:17:38.419 "malloc4" 00:17:38.419 ], 00:17:38.419 "superblock": false, 00:17:38.419 "method": "bdev_raid_create", 00:17:38.419 "req_id": 1 00:17:38.419 } 00:17:38.419 Got JSON-RPC error response 00:17:38.419 response: 00:17:38.419 { 00:17:38.419 "code": -17, 00:17:38.419 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:38.419 } 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.419 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:38.677 [2024-07-24 18:54:23.669665] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:38.677 [2024-07-24 18:54:23.669689] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:38.677 [2024-07-24 18:54:23.669700] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d0c60 00:17:38.677 [2024-07-24 18:54:23.669706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:38.677 [2024-07-24 18:54:23.671093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:38.677 [2024-07-24 18:54:23.671115] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:38.677 [2024-07-24 18:54:23.671167] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:17:38.677 [2024-07-24 18:54:23.671186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:38.677 pt1 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.677 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.936 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.936 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.936 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.936 "name": "raid_bdev1", 00:17:38.936 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:38.936 "strip_size_kb": 0, 00:17:38.936 "state": "configuring", 00:17:38.936 "raid_level": "raid1", 00:17:38.936 "superblock": true, 00:17:38.936 "num_base_bdevs": 4, 00:17:38.936 "num_base_bdevs_discovered": 1, 00:17:38.936 "num_base_bdevs_operational": 4, 00:17:38.936 "base_bdevs_list": [ 00:17:38.936 { 00:17:38.936 "name": "pt1", 00:17:38.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:38.936 "is_configured": true, 00:17:38.936 "data_offset": 2048, 00:17:38.936 "data_size": 63488 00:17:38.936 }, 00:17:38.936 { 00:17:38.936 "name": null, 00:17:38.936 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:38.936 "is_configured": false, 00:17:38.936 "data_offset": 2048, 00:17:38.936 "data_size": 63488 00:17:38.936 }, 00:17:38.936 { 00:17:38.936 "name": null, 00:17:38.936 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:38.936 "is_configured": false, 00:17:38.936 "data_offset": 2048, 00:17:38.936 "data_size": 63488 00:17:38.936 }, 00:17:38.936 { 00:17:38.936 "name": null, 00:17:38.936 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:38.936 "is_configured": false, 00:17:38.936 "data_offset": 2048, 00:17:38.936 "data_size": 63488 00:17:38.936 } 00:17:38.936 ] 00:17:38.936 }' 00:17:38.936 18:54:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.936 18:54:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.503 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:39.503 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:17:39.503 [2024-07-24 18:54:24.471739] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:39.503 [2024-07-24 18:54:24.471776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:39.503 [2024-07-24 18:54:24.471789] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1323aa0 00:17:39.503 [2024-07-24 18:54:24.471796] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:39.503 [2024-07-24 18:54:24.472075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:39.503 [2024-07-24 18:54:24.472086] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:39.503 [2024-07-24 18:54:24.472135] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:39.503 [2024-07-24 18:54:24.472149] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:39.503 pt2 00:17:39.503 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:39.762 [2024-07-24 18:54:24.636173] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:39.762 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.021 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.021 "name": "raid_bdev1", 00:17:40.021 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:40.021 "strip_size_kb": 0, 00:17:40.021 "state": "configuring", 00:17:40.021 "raid_level": "raid1", 00:17:40.021 "superblock": true, 00:17:40.021 "num_base_bdevs": 4, 00:17:40.021 "num_base_bdevs_discovered": 1, 00:17:40.021 "num_base_bdevs_operational": 4, 00:17:40.021 "base_bdevs_list": [ 00:17:40.021 { 00:17:40.021 "name": "pt1", 00:17:40.021 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:40.021 "is_configured": true, 00:17:40.021 "data_offset": 2048, 00:17:40.021 "data_size": 63488 00:17:40.021 }, 00:17:40.021 { 00:17:40.021 "name": null, 00:17:40.021 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:17:40.021 "is_configured": false, 00:17:40.021 "data_offset": 2048, 00:17:40.021 "data_size": 63488 00:17:40.021 }, 00:17:40.021 { 00:17:40.021 "name": null, 00:17:40.021 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:40.021 "is_configured": false, 00:17:40.021 "data_offset": 2048, 00:17:40.021 "data_size": 63488 00:17:40.021 }, 00:17:40.021 { 00:17:40.021 "name": null, 00:17:40.021 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:40.021 "is_configured": false, 00:17:40.021 "data_offset": 2048, 00:17:40.021 "data_size": 63488 00:17:40.021 } 00:17:40.021 ] 00:17:40.021 }' 00:17:40.021 18:54:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.021 18:54:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.279 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:40.279 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:40.279 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:40.537 [2024-07-24 18:54:25.426205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:40.537 [2024-07-24 18:54:25.426239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.537 [2024-07-24 18:54:25.426250] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1323e30 00:17:40.537 [2024-07-24 18:54:25.426255] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.537 [2024-07-24 18:54:25.426556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.537 [2024-07-24 18:54:25.426568] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:40.537 [2024-07-24 18:54:25.426616] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:40.537 [2024-07-24 18:54:25.426629] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:40.537 pt2 00:17:40.537 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:40.537 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:40.537 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:40.795 [2024-07-24 18:54:25.598654] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:40.795 [2024-07-24 18:54:25.598675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.795 [2024-07-24 18:54:25.598685] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d2fc0 00:17:40.795 [2024-07-24 18:54:25.598690] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.795 [2024-07-24 18:54:25.598890] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.795 [2024-07-24 18:54:25.598898] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:40.795 [2024-07-24 18:54:25.598934] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:17:40.795 [2024-07-24 18:54:25.598944] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:40.795 pt3 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:40.795 [2024-07-24 18:54:25.779128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:40.795 [2024-07-24 18:54:25.779148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.795 [2024-07-24 18:54:25.779156] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d1030 00:17:40.795 [2024-07-24 18:54:25.779161] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.795 [2024-07-24 18:54:25.779398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.795 [2024-07-24 18:54:25.779407] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:40.795 [2024-07-24 18:54:25.779443] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:40.795 [2024-07-24 18:54:25.779454] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:40.795 [2024-07-24 18:54:25.779555] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d1ee0 00:17:40.795 [2024-07-24 18:54:25.779561] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:40.795 [2024-07-24 18:54:25.779703] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d51b0 00:17:40.795 [2024-07-24 18:54:25.779803] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d1ee0 00:17:40.795 [2024-07-24 18:54:25.779808] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d1ee0 00:17:40.795 [2024-07-24 18:54:25.779875] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:40.795 pt4 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.795 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.796 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:41.054 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.054 "name": "raid_bdev1", 00:17:41.054 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:41.054 "strip_size_kb": 0, 00:17:41.054 "state": "online", 00:17:41.054 "raid_level": "raid1", 00:17:41.054 "superblock": true, 00:17:41.054 "num_base_bdevs": 4, 00:17:41.054 "num_base_bdevs_discovered": 4, 00:17:41.054 "num_base_bdevs_operational": 4, 00:17:41.054 "base_bdevs_list": [ 00:17:41.054 { 00:17:41.054 "name": "pt1", 00:17:41.054 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:41.054 "is_configured": true, 00:17:41.054 "data_offset": 2048, 00:17:41.054 "data_size": 63488 00:17:41.054 }, 00:17:41.054 { 00:17:41.054 "name": "pt2", 00:17:41.054 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:41.054 "is_configured": true, 00:17:41.054 "data_offset": 2048, 00:17:41.054 "data_size": 63488 00:17:41.054 }, 00:17:41.054 { 00:17:41.054 "name": "pt3", 00:17:41.054 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:41.054 "is_configured": true, 00:17:41.054 "data_offset": 2048, 00:17:41.054 "data_size": 63488 00:17:41.054 }, 00:17:41.054 { 00:17:41.054 "name": "pt4", 00:17:41.054 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:41.054 "is_configured": true, 00:17:41.054 "data_offset": 2048, 00:17:41.054 "data_size": 63488 00:17:41.054 } 00:17:41.054 ] 00:17:41.054 }' 00:17:41.054 18:54:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.054 18:54:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:41.620 [2024-07-24 18:54:26.573393] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:41.620 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:41.620 "name": "raid_bdev1", 00:17:41.620 "aliases": [ 00:17:41.620 "34cf55d3-93dc-4fc0-824a-a3a127d964b7" 00:17:41.620 ], 00:17:41.620 "product_name": "Raid Volume", 00:17:41.620 "block_size": 512, 00:17:41.620 "num_blocks": 63488, 00:17:41.620 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:41.620 "assigned_rate_limits": { 00:17:41.620 "rw_ios_per_sec": 0, 
00:17:41.620 "rw_mbytes_per_sec": 0, 00:17:41.620 "r_mbytes_per_sec": 0, 00:17:41.620 "w_mbytes_per_sec": 0 00:17:41.620 }, 00:17:41.620 "claimed": false, 00:17:41.620 "zoned": false, 00:17:41.620 "supported_io_types": { 00:17:41.620 "read": true, 00:17:41.620 "write": true, 00:17:41.620 "unmap": false, 00:17:41.620 "flush": false, 00:17:41.620 "reset": true, 00:17:41.620 "nvme_admin": false, 00:17:41.620 "nvme_io": false, 00:17:41.620 "nvme_io_md": false, 00:17:41.620 "write_zeroes": true, 00:17:41.620 "zcopy": false, 00:17:41.620 "get_zone_info": false, 00:17:41.620 "zone_management": false, 00:17:41.620 "zone_append": false, 00:17:41.620 "compare": false, 00:17:41.620 "compare_and_write": false, 00:17:41.620 "abort": false, 00:17:41.620 "seek_hole": false, 00:17:41.620 "seek_data": false, 00:17:41.620 "copy": false, 00:17:41.620 "nvme_iov_md": false 00:17:41.620 }, 00:17:41.620 "memory_domains": [ 00:17:41.620 { 00:17:41.620 "dma_device_id": "system", 00:17:41.620 "dma_device_type": 1 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.620 "dma_device_type": 2 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "system", 00:17:41.620 "dma_device_type": 1 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.620 "dma_device_type": 2 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "system", 00:17:41.620 "dma_device_type": 1 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.620 "dma_device_type": 2 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "system", 00:17:41.620 "dma_device_type": 1 00:17:41.620 }, 00:17:41.620 { 00:17:41.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.620 "dma_device_type": 2 00:17:41.620 } 00:17:41.620 ], 00:17:41.620 "driver_specific": { 00:17:41.620 "raid": { 00:17:41.620 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:41.620 "strip_size_kb": 0, 00:17:41.620 "state": "online", 00:17:41.620 "raid_level": "raid1", 00:17:41.620 "superblock": true, 00:17:41.620 "num_base_bdevs": 4, 00:17:41.620 "num_base_bdevs_discovered": 4, 00:17:41.620 "num_base_bdevs_operational": 4, 00:17:41.620 "base_bdevs_list": [ 00:17:41.620 { 00:17:41.621 "name": "pt1", 00:17:41.621 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:41.621 "is_configured": true, 00:17:41.621 "data_offset": 2048, 00:17:41.621 "data_size": 63488 00:17:41.621 }, 00:17:41.621 { 00:17:41.621 "name": "pt2", 00:17:41.621 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:41.621 "is_configured": true, 00:17:41.621 "data_offset": 2048, 00:17:41.621 "data_size": 63488 00:17:41.621 }, 00:17:41.621 { 00:17:41.621 "name": "pt3", 00:17:41.621 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:41.621 "is_configured": true, 00:17:41.621 "data_offset": 2048, 00:17:41.621 "data_size": 63488 00:17:41.621 }, 00:17:41.621 { 00:17:41.621 "name": "pt4", 00:17:41.621 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:41.621 "is_configured": true, 00:17:41.621 "data_offset": 2048, 00:17:41.621 "data_size": 63488 00:17:41.621 } 00:17:41.621 ] 00:17:41.621 } 00:17:41.621 } 00:17:41.621 }' 00:17:41.621 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:41.879 pt2 00:17:41.879 pt3 00:17:41.879 pt4' 00:17:41.879 18:54:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.879 "name": "pt1", 00:17:41.879 "aliases": [ 00:17:41.879 "00000000-0000-0000-0000-000000000001" 00:17:41.879 ], 00:17:41.879 "product_name": "passthru", 00:17:41.879 "block_size": 512, 00:17:41.879 "num_blocks": 65536, 00:17:41.879 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:41.879 "assigned_rate_limits": { 00:17:41.879 "rw_ios_per_sec": 0, 00:17:41.879 "rw_mbytes_per_sec": 0, 00:17:41.879 "r_mbytes_per_sec": 0, 00:17:41.879 "w_mbytes_per_sec": 0 00:17:41.879 }, 00:17:41.879 "claimed": true, 00:17:41.879 "claim_type": "exclusive_write", 00:17:41.879 "zoned": false, 00:17:41.879 "supported_io_types": { 00:17:41.879 "read": true, 00:17:41.879 "write": true, 00:17:41.879 "unmap": true, 00:17:41.879 "flush": true, 00:17:41.879 "reset": true, 00:17:41.879 "nvme_admin": false, 00:17:41.879 "nvme_io": false, 00:17:41.879 "nvme_io_md": false, 00:17:41.879 "write_zeroes": true, 00:17:41.879 "zcopy": true, 00:17:41.879 "get_zone_info": false, 00:17:41.879 "zone_management": false, 00:17:41.879 "zone_append": false, 00:17:41.879 "compare": false, 00:17:41.879 "compare_and_write": false, 00:17:41.879 "abort": true, 00:17:41.879 "seek_hole": false, 00:17:41.879 "seek_data": false, 00:17:41.879 "copy": true, 00:17:41.879 "nvme_iov_md": false 00:17:41.879 }, 00:17:41.879 "memory_domains": [ 00:17:41.879 { 00:17:41.879 "dma_device_id": "system", 00:17:41.879 "dma_device_type": 1 00:17:41.879 }, 00:17:41.879 { 00:17:41.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.879 "dma_device_type": 2 00:17:41.879 } 00:17:41.879 ], 00:17:41.879 "driver_specific": { 00:17:41.879 "passthru": { 00:17:41.879 "name": "pt1", 00:17:41.879 "base_bdev_name": "malloc1" 00:17:41.879 } 00:17:41.879 } 00:17:41.879 }' 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.879 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.137 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.137 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.137 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.137 18:54:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.137 18:54:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:42.137 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.394 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.395 "name": "pt2", 00:17:42.395 "aliases": [ 00:17:42.395 "00000000-0000-0000-0000-000000000002" 00:17:42.395 ], 00:17:42.395 "product_name": "passthru", 00:17:42.395 "block_size": 512, 00:17:42.395 "num_blocks": 65536, 00:17:42.395 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:42.395 "assigned_rate_limits": { 00:17:42.395 "rw_ios_per_sec": 0, 00:17:42.395 "rw_mbytes_per_sec": 0, 00:17:42.395 "r_mbytes_per_sec": 0, 00:17:42.395 "w_mbytes_per_sec": 0 00:17:42.395 }, 00:17:42.395 "claimed": true, 00:17:42.395 "claim_type": "exclusive_write", 00:17:42.395 "zoned": false, 00:17:42.395 "supported_io_types": { 00:17:42.395 "read": true, 00:17:42.395 "write": true, 00:17:42.395 "unmap": true, 00:17:42.395 "flush": true, 00:17:42.395 "reset": true, 00:17:42.395 "nvme_admin": false, 00:17:42.395 "nvme_io": false, 00:17:42.395 "nvme_io_md": false, 00:17:42.395 "write_zeroes": true, 00:17:42.395 "zcopy": true, 00:17:42.395 "get_zone_info": false, 00:17:42.395 "zone_management": false, 00:17:42.395 "zone_append": false, 00:17:42.395 "compare": false, 00:17:42.395 "compare_and_write": false, 00:17:42.395 "abort": true, 00:17:42.395 "seek_hole": false, 00:17:42.395 "seek_data": false, 00:17:42.395 "copy": true, 00:17:42.395 "nvme_iov_md": false 00:17:42.395 }, 00:17:42.395 "memory_domains": [ 00:17:42.395 { 00:17:42.395 "dma_device_id": "system", 00:17:42.395 "dma_device_type": 1 00:17:42.395 }, 00:17:42.395 { 00:17:42.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.395 "dma_device_type": 2 00:17:42.395 } 00:17:42.395 ], 00:17:42.395 "driver_specific": { 00:17:42.395 "passthru": { 00:17:42.395 "name": "pt2", 00:17:42.395 "base_bdev_name": "malloc2" 00:17:42.395 } 00:17:42.395 } 00:17:42.395 }' 00:17:42.395 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.395 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.395 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.395 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.395 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:42.653 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.911 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.911 "name": "pt3", 00:17:42.911 "aliases": [ 00:17:42.911 "00000000-0000-0000-0000-000000000003" 00:17:42.911 ], 00:17:42.911 "product_name": "passthru", 00:17:42.911 "block_size": 512, 00:17:42.911 "num_blocks": 65536, 00:17:42.912 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:42.912 "assigned_rate_limits": { 00:17:42.912 "rw_ios_per_sec": 0, 00:17:42.912 "rw_mbytes_per_sec": 0, 00:17:42.912 "r_mbytes_per_sec": 0, 00:17:42.912 "w_mbytes_per_sec": 0 00:17:42.912 }, 00:17:42.912 "claimed": true, 00:17:42.912 "claim_type": "exclusive_write", 00:17:42.912 "zoned": false, 00:17:42.912 "supported_io_types": { 00:17:42.912 "read": true, 00:17:42.912 "write": true, 00:17:42.912 "unmap": true, 00:17:42.912 "flush": true, 00:17:42.912 "reset": true, 00:17:42.912 "nvme_admin": false, 00:17:42.912 "nvme_io": false, 00:17:42.912 "nvme_io_md": false, 00:17:42.912 "write_zeroes": true, 00:17:42.912 "zcopy": true, 00:17:42.912 "get_zone_info": false, 00:17:42.912 "zone_management": false, 00:17:42.912 "zone_append": false, 00:17:42.912 "compare": false, 00:17:42.912 "compare_and_write": false, 00:17:42.912 "abort": true, 00:17:42.912 "seek_hole": false, 00:17:42.912 "seek_data": false, 00:17:42.912 "copy": true, 00:17:42.912 "nvme_iov_md": false 00:17:42.912 }, 00:17:42.912 "memory_domains": [ 00:17:42.912 { 00:17:42.912 "dma_device_id": "system", 00:17:42.912 "dma_device_type": 1 00:17:42.912 }, 00:17:42.912 { 00:17:42.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.912 "dma_device_type": 2 00:17:42.912 } 00:17:42.912 ], 00:17:42.912 "driver_specific": { 00:17:42.912 "passthru": { 00:17:42.912 "name": "pt3", 00:17:42.912 "base_bdev_name": "malloc3" 00:17:42.912 } 00:17:42.912 } 00:17:42.912 }' 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.912 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.170 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.170 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.170 18:54:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.170 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.170 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.170 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.170 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:43.170 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:17:43.428 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.428 "name": "pt4", 00:17:43.428 "aliases": [ 00:17:43.428 "00000000-0000-0000-0000-000000000004" 00:17:43.428 ], 00:17:43.428 "product_name": "passthru", 00:17:43.428 "block_size": 512, 00:17:43.428 "num_blocks": 65536, 00:17:43.428 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:43.428 "assigned_rate_limits": { 00:17:43.428 "rw_ios_per_sec": 0, 00:17:43.428 "rw_mbytes_per_sec": 0, 00:17:43.428 "r_mbytes_per_sec": 0, 00:17:43.428 "w_mbytes_per_sec": 0 00:17:43.428 }, 00:17:43.428 "claimed": true, 00:17:43.428 "claim_type": "exclusive_write", 00:17:43.428 "zoned": false, 00:17:43.428 "supported_io_types": { 00:17:43.428 "read": true, 00:17:43.428 "write": true, 00:17:43.428 "unmap": true, 00:17:43.429 "flush": true, 00:17:43.429 "reset": true, 00:17:43.429 "nvme_admin": false, 00:17:43.429 "nvme_io": false, 00:17:43.429 "nvme_io_md": false, 00:17:43.429 "write_zeroes": true, 00:17:43.429 "zcopy": true, 00:17:43.429 "get_zone_info": false, 00:17:43.429 "zone_management": false, 00:17:43.429 "zone_append": false, 00:17:43.429 "compare": false, 00:17:43.429 "compare_and_write": false, 00:17:43.429 "abort": true, 00:17:43.429 "seek_hole": false, 00:17:43.429 "seek_data": false, 00:17:43.429 "copy": true, 00:17:43.429 "nvme_iov_md": false 00:17:43.429 }, 00:17:43.429 "memory_domains": [ 00:17:43.429 { 00:17:43.429 "dma_device_id": "system", 00:17:43.429 "dma_device_type": 1 00:17:43.429 }, 00:17:43.429 { 00:17:43.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.429 "dma_device_type": 2 00:17:43.429 } 00:17:43.429 ], 00:17:43.429 "driver_specific": { 00:17:43.429 "passthru": { 00:17:43.429 "name": "pt4", 00:17:43.429 "base_bdev_name": "malloc4" 00:17:43.429 } 00:17:43.429 } 00:17:43.429 }' 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.429 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:43.703 [2024-07-24 18:54:28.682827] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
34cf55d3-93dc-4fc0-824a-a3a127d964b7 '!=' 34cf55d3-93dc-4fc0-824a-a3a127d964b7 ']' 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:43.703 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:43.962 [2024-07-24 18:54:28.847105] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.962 18:54:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:44.220 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.220 "name": "raid_bdev1", 00:17:44.220 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:44.220 "strip_size_kb": 0, 00:17:44.220 "state": "online", 00:17:44.220 "raid_level": "raid1", 00:17:44.220 "superblock": true, 00:17:44.220 "num_base_bdevs": 4, 00:17:44.220 "num_base_bdevs_discovered": 3, 00:17:44.220 "num_base_bdevs_operational": 3, 00:17:44.220 "base_bdevs_list": [ 00:17:44.220 { 00:17:44.220 "name": null, 00:17:44.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.220 "is_configured": false, 00:17:44.220 "data_offset": 2048, 00:17:44.220 "data_size": 63488 00:17:44.220 }, 00:17:44.220 { 00:17:44.220 "name": "pt2", 00:17:44.220 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:44.220 "is_configured": true, 00:17:44.220 "data_offset": 2048, 00:17:44.220 "data_size": 63488 00:17:44.220 }, 00:17:44.220 { 00:17:44.220 "name": "pt3", 00:17:44.220 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:44.220 "is_configured": true, 00:17:44.220 "data_offset": 2048, 00:17:44.220 "data_size": 63488 00:17:44.220 }, 00:17:44.220 { 00:17:44.220 "name": "pt4", 00:17:44.220 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:44.220 "is_configured": true, 00:17:44.220 "data_offset": 2048, 00:17:44.220 "data_size": 63488 00:17:44.220 } 00:17:44.220 ] 00:17:44.220 }' 00:17:44.220 18:54:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.220 18:54:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.786 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:44.786 [2024-07-24 18:54:29.673218] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:44.786 [2024-07-24 18:54:29.673234] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:44.786 [2024-07-24 18:54:29.673267] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:44.786 [2024-07-24 18:54:29.673313] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:44.786 [2024-07-24 18:54:29.673319] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d1ee0 name raid_bdev1, state offline 00:17:44.786 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.786 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:45.044 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:45.044 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:45.044 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:45.044 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:45.044 18:54:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:45.044 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:45.044 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:45.044 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:45.302 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:45.302 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:45.302 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:45.561 [2024-07-24 18:54:30.459231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:45.561 [2024-07-24 18:54:30.459264] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.561 [2024-07-24 18:54:30.459274] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d4eb0 00:17:45.561 [2024-07-24 18:54:30.459280] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.561 [2024-07-24 18:54:30.460753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.561 [2024-07-24 18:54:30.460775] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:45.561 [2024-07-24 18:54:30.460828] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:45.561 [2024-07-24 18:54:30.460849] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:45.561 pt2 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.561 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:45.820 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.820 "name": "raid_bdev1", 00:17:45.820 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:45.820 "strip_size_kb": 0, 00:17:45.820 "state": "configuring", 00:17:45.820 "raid_level": "raid1", 00:17:45.820 "superblock": true, 00:17:45.820 "num_base_bdevs": 4, 00:17:45.820 "num_base_bdevs_discovered": 1, 00:17:45.820 "num_base_bdevs_operational": 3, 00:17:45.820 "base_bdevs_list": [ 00:17:45.820 { 00:17:45.820 "name": null, 00:17:45.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.820 "is_configured": false, 00:17:45.820 "data_offset": 2048, 00:17:45.820 "data_size": 63488 00:17:45.820 }, 00:17:45.820 { 00:17:45.820 "name": "pt2", 00:17:45.820 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:45.820 "is_configured": true, 00:17:45.820 "data_offset": 2048, 00:17:45.820 "data_size": 63488 00:17:45.820 }, 00:17:45.820 { 00:17:45.820 "name": null, 00:17:45.820 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:45.820 "is_configured": false, 00:17:45.820 "data_offset": 2048, 00:17:45.820 "data_size": 63488 00:17:45.820 }, 00:17:45.820 { 00:17:45.820 "name": null, 00:17:45.820 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:45.820 "is_configured": 
false, 00:17:45.820 "data_offset": 2048, 00:17:45.820 "data_size": 63488 00:17:45.820 } 00:17:45.820 ] 00:17:45.820 }' 00:17:45.820 18:54:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.820 18:54:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:46.387 [2024-07-24 18:54:31.285385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:46.387 [2024-07-24 18:54:31.285423] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.387 [2024-07-24 18:54:31.285450] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1325a40 00:17:46.387 [2024-07-24 18:54:31.285456] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.387 [2024-07-24 18:54:31.285742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.387 [2024-07-24 18:54:31.285753] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:46.387 [2024-07-24 18:54:31.285800] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:46.387 [2024-07-24 18:54:31.285813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:46.387 pt3 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.387 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:46.645 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.645 "name": "raid_bdev1", 00:17:46.645 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:46.645 "strip_size_kb": 0, 00:17:46.645 "state": "configuring", 00:17:46.645 "raid_level": "raid1", 00:17:46.645 "superblock": true, 00:17:46.645 "num_base_bdevs": 
4, 00:17:46.645 "num_base_bdevs_discovered": 2, 00:17:46.645 "num_base_bdevs_operational": 3, 00:17:46.645 "base_bdevs_list": [ 00:17:46.645 { 00:17:46.645 "name": null, 00:17:46.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.645 "is_configured": false, 00:17:46.645 "data_offset": 2048, 00:17:46.645 "data_size": 63488 00:17:46.645 }, 00:17:46.645 { 00:17:46.645 "name": "pt2", 00:17:46.645 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:46.645 "is_configured": true, 00:17:46.645 "data_offset": 2048, 00:17:46.645 "data_size": 63488 00:17:46.645 }, 00:17:46.645 { 00:17:46.645 "name": "pt3", 00:17:46.645 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:46.645 "is_configured": true, 00:17:46.645 "data_offset": 2048, 00:17:46.645 "data_size": 63488 00:17:46.645 }, 00:17:46.645 { 00:17:46.645 "name": null, 00:17:46.645 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:46.645 "is_configured": false, 00:17:46.645 "data_offset": 2048, 00:17:46.645 "data_size": 63488 00:17:46.645 } 00:17:46.645 ] 00:17:46.645 }' 00:17:46.645 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.645 18:54:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.212 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:47.212 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:47.212 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:17:47.212 18:54:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:47.212 [2024-07-24 18:54:32.091458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:47.212 [2024-07-24 18:54:32.091496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:47.212 [2024-07-24 18:54:32.091507] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d4600 00:17:47.212 [2024-07-24 18:54:32.091529] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:47.212 [2024-07-24 18:54:32.091798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:47.212 [2024-07-24 18:54:32.091808] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:47.212 [2024-07-24 18:54:32.091850] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:47.212 [2024-07-24 18:54:32.091862] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:47.212 [2024-07-24 18:54:32.091943] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1323d50 00:17:47.212 [2024-07-24 18:54:32.091949] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:47.212 [2024-07-24 18:54:32.092066] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bb9f0 00:17:47.212 [2024-07-24 18:54:32.092158] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1323d50 00:17:47.212 [2024-07-24 18:54:32.092164] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1323d50 00:17:47.212 [2024-07-24 18:54:32.092227] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:47.212 pt4 00:17:47.212 18:54:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:47.212 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.471 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.471 "name": "raid_bdev1", 00:17:47.471 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:47.471 "strip_size_kb": 0, 00:17:47.471 "state": "online", 00:17:47.471 "raid_level": "raid1", 00:17:47.471 "superblock": true, 00:17:47.471 "num_base_bdevs": 4, 00:17:47.471 "num_base_bdevs_discovered": 3, 00:17:47.471 "num_base_bdevs_operational": 3, 00:17:47.471 "base_bdevs_list": [ 00:17:47.471 { 00:17:47.471 "name": null, 00:17:47.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.471 "is_configured": false, 00:17:47.471 "data_offset": 2048, 00:17:47.471 "data_size": 63488 00:17:47.471 }, 00:17:47.471 { 00:17:47.471 "name": "pt2", 00:17:47.471 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:47.471 "is_configured": true, 00:17:47.471 "data_offset": 2048, 00:17:47.471 "data_size": 63488 00:17:47.471 }, 00:17:47.471 { 00:17:47.471 "name": "pt3", 00:17:47.471 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:47.471 "is_configured": true, 00:17:47.471 "data_offset": 2048, 00:17:47.471 "data_size": 63488 00:17:47.471 }, 00:17:47.471 { 00:17:47.471 "name": "pt4", 00:17:47.471 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:47.471 "is_configured": true, 00:17:47.471 "data_offset": 2048, 00:17:47.471 "data_size": 63488 00:17:47.471 } 00:17:47.471 ] 00:17:47.471 }' 00:17:47.471 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.471 18:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.038 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:48.038 [2024-07-24 18:54:32.933617] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:48.038 [2024-07-24 18:54:32.933636] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:48.038 [2024-07-24 18:54:32.933672] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:17:48.038 [2024-07-24 18:54:32.933717] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:48.038 [2024-07-24 18:54:32.933722] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1323d50 name raid_bdev1, state offline 00:17:48.038 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.038 18:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:48.296 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:48.296 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:48.296 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:17:48.296 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:17:48.297 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:48.555 [2024-07-24 18:54:33.479015] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:48.555 [2024-07-24 18:54:33.479044] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.555 [2024-07-24 18:54:33.479055] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d56a0 00:17:48.555 [2024-07-24 18:54:33.479077] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.555 [2024-07-24 18:54:33.480515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.555 [2024-07-24 18:54:33.480536] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:48.555 [2024-07-24 18:54:33.480583] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:48.555 [2024-07-24 18:54:33.480602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:48.555 [2024-07-24 18:54:33.480678] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:48.555 [2024-07-24 18:54:33.480685] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:48.555 [2024-07-24 18:54:33.480693] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bbdd0 name raid_bdev1, state configuring 00:17:48.555 [2024-07-24 18:54:33.480710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:48.555 [2024-07-24 18:54:33.480762] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:48.555 pt1 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.555 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:48.813 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.813 "name": "raid_bdev1", 00:17:48.813 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:48.813 "strip_size_kb": 0, 00:17:48.813 "state": "configuring", 00:17:48.813 "raid_level": "raid1", 00:17:48.813 "superblock": true, 00:17:48.813 "num_base_bdevs": 4, 00:17:48.813 "num_base_bdevs_discovered": 2, 00:17:48.813 "num_base_bdevs_operational": 3, 00:17:48.813 "base_bdevs_list": [ 00:17:48.813 { 00:17:48.813 "name": null, 00:17:48.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.813 "is_configured": false, 00:17:48.813 "data_offset": 2048, 00:17:48.813 "data_size": 63488 00:17:48.813 }, 00:17:48.813 { 00:17:48.813 "name": "pt2", 00:17:48.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:48.813 "is_configured": true, 00:17:48.813 "data_offset": 2048, 00:17:48.813 "data_size": 63488 00:17:48.813 }, 00:17:48.813 { 00:17:48.813 "name": "pt3", 00:17:48.813 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:48.813 "is_configured": true, 00:17:48.813 "data_offset": 2048, 00:17:48.813 "data_size": 63488 00:17:48.813 }, 00:17:48.813 { 00:17:48.813 "name": null, 00:17:48.813 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:48.813 "is_configured": false, 00:17:48.814 "data_offset": 2048, 00:17:48.814 "data_size": 63488 00:17:48.814 } 00:17:48.814 ] 00:17:48.814 }' 00:17:48.814 18:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.814 18:54:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.379 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:49.379 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:49.379 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:49.379 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:49.637 [2024-07-24 18:54:34.489644] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:49.637 [2024-07-24 18:54:34.489677] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:49.637 [2024-07-24 18:54:34.489688] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d4070 00:17:49.638 [2024-07-24 18:54:34.489695] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:49.638 [2024-07-24 18:54:34.489983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:49.638 [2024-07-24 18:54:34.489994] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:49.638 [2024-07-24 18:54:34.490038] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:49.638 [2024-07-24 18:54:34.490051] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:49.638 [2024-07-24 18:54:34.490141] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1325050 00:17:49.638 [2024-07-24 18:54:34.490151] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:49.638 [2024-07-24 18:54:34.490278] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bb910 00:17:49.638 [2024-07-24 18:54:34.490382] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1325050 00:17:49.638 [2024-07-24 18:54:34.490388] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1325050 00:17:49.638 [2024-07-24 18:54:34.490461] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:49.638 pt4 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.638 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:49.897 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.897 "name": "raid_bdev1", 00:17:49.897 "uuid": "34cf55d3-93dc-4fc0-824a-a3a127d964b7", 00:17:49.897 "strip_size_kb": 0, 00:17:49.897 "state": "online", 00:17:49.897 "raid_level": "raid1", 00:17:49.897 "superblock": true, 00:17:49.897 "num_base_bdevs": 4, 00:17:49.897 "num_base_bdevs_discovered": 3, 00:17:49.897 "num_base_bdevs_operational": 3, 00:17:49.897 "base_bdevs_list": [ 00:17:49.897 { 00:17:49.897 "name": null, 00:17:49.897 
"uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.897 "is_configured": false, 00:17:49.897 "data_offset": 2048, 00:17:49.897 "data_size": 63488 00:17:49.897 }, 00:17:49.897 { 00:17:49.897 "name": "pt2", 00:17:49.897 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:49.897 "is_configured": true, 00:17:49.897 "data_offset": 2048, 00:17:49.897 "data_size": 63488 00:17:49.897 }, 00:17:49.897 { 00:17:49.897 "name": "pt3", 00:17:49.897 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:49.897 "is_configured": true, 00:17:49.897 "data_offset": 2048, 00:17:49.897 "data_size": 63488 00:17:49.897 }, 00:17:49.897 { 00:17:49.897 "name": "pt4", 00:17:49.897 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:49.897 "is_configured": true, 00:17:49.897 "data_offset": 2048, 00:17:49.897 "data_size": 63488 00:17:49.897 } 00:17:49.897 ] 00:17:49.897 }' 00:17:49.897 18:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.897 18:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.464 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:50.464 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:50.464 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:50.464 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:50.464 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:50.723 [2024-07-24 18:54:35.512480] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 34cf55d3-93dc-4fc0-824a-a3a127d964b7 '!=' 34cf55d3-93dc-4fc0-824a-a3a127d964b7 ']' 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2143830 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2143830 ']' 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2143830 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2143830 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2143830' 00:17:50.723 killing process with pid 2143830 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2143830 00:17:50.723 [2024-07-24 18:54:35.568877] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:50.723 [2024-07-24 18:54:35.568919] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:50.723 [2024-07-24 18:54:35.568970] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:50.723 [2024-07-24 18:54:35.568976] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1325050 name raid_bdev1, state offline 00:17:50.723 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2143830 00:17:50.723 [2024-07-24 18:54:35.624988] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:50.982 18:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:50.982 00:17:50.982 real 0m19.153s 00:17:50.982 user 0m35.479s 00:17:50.982 sys 0m2.830s 00:17:50.982 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:50.982 18:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.982 ************************************ 00:17:50.982 END TEST raid_superblock_test 00:17:50.982 ************************************ 00:17:50.982 18:54:35 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:17:50.982 18:54:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:50.982 18:54:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:50.982 18:54:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:50.982 ************************************ 00:17:50.982 START TEST raid_read_error_test 00:17:50.982 ************************************ 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:50.982 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RPI4Ja3s6O 00:17:51.241 18:54:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2147513 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2147513 /var/tmp/spdk-raid.sock 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2147513 ']' 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:51.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:51.241 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.241 [2024-07-24 18:54:36.048120] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:17:51.241 [2024-07-24 18:54:36.048156] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2147513 ] 00:17:51.241 [2024-07-24 18:54:36.114400] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:51.241 [2024-07-24 18:54:36.190457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:51.242 [2024-07-24 18:54:36.239743] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:51.242 [2024-07-24 18:54:36.239767] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.178 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:52.178 18:54:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:52.178 18:54:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.178 18:54:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:52.178 BaseBdev1_malloc 00:17:52.178 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:52.178 true 00:17:52.178 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:52.437 [2024-07-24 18:54:37.343996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:52.437 [2024-07-24 18:54:37.344036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:52.437 [2024-07-24 18:54:37.344046] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2857d20 00:17:52.437 [2024-07-24 18:54:37.344052] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.437 [2024-07-24 18:54:37.345165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.437 [2024-07-24 18:54:37.345184] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:52.437 BaseBdev1 00:17:52.437 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.437 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:52.696 BaseBdev2_malloc 00:17:52.696 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:52.696 true 00:17:52.955 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:52.955 [2024-07-24 18:54:37.860673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:52.955 [2024-07-24 18:54:37.860698] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:52.955 [2024-07-24 18:54:37.860708] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x285cd50 00:17:52.955 [2024-07-24 18:54:37.860714] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.955 [2024-07-24 18:54:37.861683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.955 [2024-07-24 18:54:37.861702] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:52.955 BaseBdev2 00:17:52.955 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.955 18:54:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:53.213 BaseBdev3_malloc 00:17:53.214 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:53.214 true 00:17:53.214 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:53.472 [2024-07-24 18:54:38.369476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:53.472 [2024-07-24 18:54:38.369511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.472 [2024-07-24 18:54:38.369521] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x285bef0 00:17:53.472 [2024-07-24 18:54:38.369528] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.472 [2024-07-24 18:54:38.370544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.472 [2024-07-24 18:54:38.370566] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:53.472 BaseBdev3 00:17:53.472 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:53.472 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:53.760 BaseBdev4_malloc 00:17:53.760 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:53.760 true 00:17:53.760 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:54.019 [2024-07-24 18:54:38.890348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:54.019 [2024-07-24 18:54:38.890382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.019 [2024-07-24 18:54:38.890395] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2860280 00:17:54.019 [2024-07-24 18:54:38.890401] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.019 [2024-07-24 18:54:38.891515] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:17:54.019 [2024-07-24 18:54:38.891537] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:54.019 BaseBdev4 00:17:54.019 18:54:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:54.278 [2024-07-24 18:54:39.058817] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.278 [2024-07-24 18:54:39.059744] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:54.279 [2024-07-24 18:54:39.059790] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:54.279 [2024-07-24 18:54:39.059826] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:54.279 [2024-07-24 18:54:39.059985] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2861d90 00:17:54.279 [2024-07-24 18:54:39.059991] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:54.279 [2024-07-24 18:54:39.060128] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2861aa0 00:17:54.279 [2024-07-24 18:54:39.060237] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2861d90 00:17:54.279 [2024-07-24 18:54:39.060242] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2861d90 00:17:54.279 [2024-07-24 18:54:39.060309] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.279 "name": "raid_bdev1", 00:17:54.279 "uuid": "554f7274-1784-4328-bbfe-bea23e3f4498", 00:17:54.279 "strip_size_kb": 0, 00:17:54.279 "state": "online", 00:17:54.279 "raid_level": "raid1", 00:17:54.279 "superblock": true, 00:17:54.279 "num_base_bdevs": 4, 00:17:54.279 "num_base_bdevs_discovered": 4, 00:17:54.279 
"num_base_bdevs_operational": 4, 00:17:54.279 "base_bdevs_list": [ 00:17:54.279 { 00:17:54.279 "name": "BaseBdev1", 00:17:54.279 "uuid": "96752b27-1a4d-56bd-b208-09009d3f4cb1", 00:17:54.279 "is_configured": true, 00:17:54.279 "data_offset": 2048, 00:17:54.279 "data_size": 63488 00:17:54.279 }, 00:17:54.279 { 00:17:54.279 "name": "BaseBdev2", 00:17:54.279 "uuid": "cdbdf6aa-b20d-5883-b11e-1278d9ff9243", 00:17:54.279 "is_configured": true, 00:17:54.279 "data_offset": 2048, 00:17:54.279 "data_size": 63488 00:17:54.279 }, 00:17:54.279 { 00:17:54.279 "name": "BaseBdev3", 00:17:54.279 "uuid": "9bae1f9c-e61f-5855-975e-ec83e1adfc8b", 00:17:54.279 "is_configured": true, 00:17:54.279 "data_offset": 2048, 00:17:54.279 "data_size": 63488 00:17:54.279 }, 00:17:54.279 { 00:17:54.279 "name": "BaseBdev4", 00:17:54.279 "uuid": "de62eb75-98ce-5979-9d71-402aeb30b031", 00:17:54.279 "is_configured": true, 00:17:54.279 "data_offset": 2048, 00:17:54.279 "data_size": 63488 00:17:54.279 } 00:17:54.279 ] 00:17:54.279 }' 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.279 18:54:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.846 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:54.846 18:54:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:54.846 [2024-07-24 18:54:39.784893] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b3430 00:17:55.781 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.040 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.041 18:54:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:56.300 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.300 "name": "raid_bdev1", 00:17:56.300 "uuid": "554f7274-1784-4328-bbfe-bea23e3f4498", 00:17:56.300 "strip_size_kb": 0, 00:17:56.300 "state": "online", 00:17:56.300 "raid_level": "raid1", 00:17:56.300 "superblock": true, 00:17:56.300 "num_base_bdevs": 4, 00:17:56.300 "num_base_bdevs_discovered": 4, 00:17:56.300 "num_base_bdevs_operational": 4, 00:17:56.300 "base_bdevs_list": [ 00:17:56.300 { 00:17:56.300 "name": "BaseBdev1", 00:17:56.300 "uuid": "96752b27-1a4d-56bd-b208-09009d3f4cb1", 00:17:56.300 "is_configured": true, 00:17:56.300 "data_offset": 2048, 00:17:56.300 "data_size": 63488 00:17:56.300 }, 00:17:56.300 { 00:17:56.300 "name": "BaseBdev2", 00:17:56.300 "uuid": "cdbdf6aa-b20d-5883-b11e-1278d9ff9243", 00:17:56.300 "is_configured": true, 00:17:56.300 "data_offset": 2048, 00:17:56.300 "data_size": 63488 00:17:56.300 }, 00:17:56.300 { 00:17:56.300 "name": "BaseBdev3", 00:17:56.300 "uuid": "9bae1f9c-e61f-5855-975e-ec83e1adfc8b", 00:17:56.300 "is_configured": true, 00:17:56.300 "data_offset": 2048, 00:17:56.300 "data_size": 63488 00:17:56.300 }, 00:17:56.300 { 00:17:56.300 "name": "BaseBdev4", 00:17:56.300 "uuid": "de62eb75-98ce-5979-9d71-402aeb30b031", 00:17:56.300 "is_configured": true, 00:17:56.300 "data_offset": 2048, 00:17:56.300 "data_size": 63488 00:17:56.300 } 00:17:56.300 ] 00:17:56.300 }' 00:17:56.300 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.300 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:56.867 [2024-07-24 18:54:41.720697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:56.867 [2024-07-24 18:54:41.720728] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:56.867 [2024-07-24 18:54:41.723003] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:56.867 [2024-07-24 18:54:41.723034] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:56.867 [2024-07-24 18:54:41.723123] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:56.867 [2024-07-24 18:54:41.723130] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2861d90 name raid_bdev1, state offline 00:17:56.867 0 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2147513 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2147513 ']' 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2147513 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2147513 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2147513' 00:17:56.867 killing process with pid 2147513 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2147513 00:17:56.867 [2024-07-24 18:54:41.784056] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:56.867 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2147513 00:17:56.867 [2024-07-24 18:54:41.810092] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RPI4Ja3s6O 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:57.126 00:17:57.126 real 0m6.009s 00:17:57.126 user 0m9.465s 00:17:57.126 sys 0m0.899s 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:57.126 18:54:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.126 ************************************ 00:17:57.126 END TEST raid_read_error_test 00:17:57.126 ************************************ 00:17:57.126 18:54:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:17:57.126 18:54:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:57.126 18:54:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:57.126 18:54:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:57.126 ************************************ 00:17:57.126 START TEST raid_write_error_test 00:17:57.126 ************************************ 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
echo BaseBdev2 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.xntcTU7Jmo 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2148634 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2148634 /var/tmp/spdk-raid.sock 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2148634 ']' 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:57.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:57.126 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.126 [2024-07-24 18:54:42.120349] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
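A minimal sketch of the RPC sequence these raid error tests drive, assuming an SPDK bdevperf instance is already listening on /var/tmp/spdk-raid.sock and rpc.py sits at the path used throughout this log; every command and name below is taken from the calls captured above, and the sketch itself is not part of the captured output.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for i in 1 2 3 4; do
    # 32 MB malloc bdev with a 512-byte block size, wrapped in an error bdev
    # (exposed as EE_BaseBdev${i}_malloc) and a passthru bdev named BaseBdev${i}
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_error_create BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
# Assemble the passthru bdevs into a raid1 bdev with an on-disk superblock (-s)
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
# Inject a failure on the first base bdev, then re-check the array state
$RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
# Tear down
$RPC bdev_raid_delete raid_bdev1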
00:17:57.126 [2024-07-24 18:54:42.120386] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2148634 ] 00:17:57.385 [2024-07-24 18:54:42.182472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:57.385 [2024-07-24 18:54:42.259651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:57.385 [2024-07-24 18:54:42.310139] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:57.385 [2024-07-24 18:54:42.310167] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:57.952 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:57.952 18:54:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:57.952 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:57.952 18:54:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:58.210 BaseBdev1_malloc 00:17:58.210 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:58.468 true 00:17:58.468 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:58.468 [2024-07-24 18:54:43.401933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:58.468 [2024-07-24 18:54:43.401966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.468 [2024-07-24 18:54:43.401977] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x160ed20 00:17:58.468 [2024-07-24 18:54:43.401983] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.468 [2024-07-24 18:54:43.403187] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.468 [2024-07-24 18:54:43.403208] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:58.468 BaseBdev1 00:17:58.468 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:58.468 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:58.726 BaseBdev2_malloc 00:17:58.726 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:58.726 true 00:17:58.985 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:58.985 [2024-07-24 18:54:43.894672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:58.985 [2024-07-24 18:54:43.894702] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.985 [2024-07-24 18:54:43.894714] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1613d50 00:17:58.985 [2024-07-24 18:54:43.894719] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.985 [2024-07-24 18:54:43.895788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.985 [2024-07-24 18:54:43.895809] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:58.985 BaseBdev2 00:17:58.985 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:58.985 18:54:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:59.243 BaseBdev3_malloc 00:17:59.243 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:59.243 true 00:17:59.243 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:59.502 [2024-07-24 18:54:44.391551] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:59.502 [2024-07-24 18:54:44.391583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:59.502 [2024-07-24 18:54:44.391600] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1612ef0 00:17:59.502 [2024-07-24 18:54:44.391621] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:59.502 [2024-07-24 18:54:44.392687] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:59.502 [2024-07-24 18:54:44.392707] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:59.502 BaseBdev3 00:17:59.502 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:59.502 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:59.760 BaseBdev4_malloc 00:17:59.760 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:59.760 true 00:17:59.760 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:00.019 [2024-07-24 18:54:44.876290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:00.019 [2024-07-24 18:54:44.876318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.019 [2024-07-24 18:54:44.876330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1617280 00:18:00.019 [2024-07-24 18:54:44.876336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.019 [2024-07-24 18:54:44.877365] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.019 [2024-07-24 18:54:44.877385] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:00.019 BaseBdev4 00:18:00.019 18:54:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:00.277 [2024-07-24 18:54:45.028710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:00.277 [2024-07-24 18:54:45.029551] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:00.277 [2024-07-24 18:54:45.029596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:00.277 [2024-07-24 18:54:45.029633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:00.277 [2024-07-24 18:54:45.029791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1618d90 00:18:00.277 [2024-07-24 18:54:45.029797] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:00.277 [2024-07-24 18:54:45.029926] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1618aa0 00:18:00.277 [2024-07-24 18:54:45.030033] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1618d90 00:18:00.277 [2024-07-24 18:54:45.030038] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1618d90 00:18:00.277 [2024-07-24 18:54:45.030106] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.277 "name": "raid_bdev1", 00:18:00.277 "uuid": "151a6bda-711b-4aaa-911a-ba0137b98af3", 00:18:00.277 "strip_size_kb": 0, 00:18:00.277 "state": "online", 00:18:00.277 "raid_level": "raid1", 00:18:00.277 "superblock": true, 00:18:00.277 "num_base_bdevs": 4, 00:18:00.277 
"num_base_bdevs_discovered": 4, 00:18:00.277 "num_base_bdevs_operational": 4, 00:18:00.277 "base_bdevs_list": [ 00:18:00.277 { 00:18:00.277 "name": "BaseBdev1", 00:18:00.277 "uuid": "15821a5f-803c-5fd1-919e-a435b33dc239", 00:18:00.277 "is_configured": true, 00:18:00.277 "data_offset": 2048, 00:18:00.277 "data_size": 63488 00:18:00.277 }, 00:18:00.277 { 00:18:00.277 "name": "BaseBdev2", 00:18:00.277 "uuid": "622ec2b1-4482-553d-a7a6-1681b35cf045", 00:18:00.277 "is_configured": true, 00:18:00.277 "data_offset": 2048, 00:18:00.277 "data_size": 63488 00:18:00.277 }, 00:18:00.277 { 00:18:00.277 "name": "BaseBdev3", 00:18:00.277 "uuid": "afa22c22-a048-52aa-83a8-13b2a4449b2a", 00:18:00.277 "is_configured": true, 00:18:00.277 "data_offset": 2048, 00:18:00.277 "data_size": 63488 00:18:00.277 }, 00:18:00.277 { 00:18:00.277 "name": "BaseBdev4", 00:18:00.277 "uuid": "566a2e06-051b-5a99-b740-0c9650a4fd9c", 00:18:00.277 "is_configured": true, 00:18:00.277 "data_offset": 2048, 00:18:00.277 "data_size": 63488 00:18:00.277 } 00:18:00.277 ] 00:18:00.277 }' 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.277 18:54:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.844 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:00.844 18:54:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:00.844 [2024-07-24 18:54:45.790899] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146a430 00:18:01.781 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:02.039 [2024-07-24 18:54:46.870421] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:02.039 [2024-07-24 18:54:46.870463] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:02.039 [2024-07-24 18:54:46.870662] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x146a430 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.039 18:54:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:02.298 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.298 "name": "raid_bdev1", 00:18:02.298 "uuid": "151a6bda-711b-4aaa-911a-ba0137b98af3", 00:18:02.298 "strip_size_kb": 0, 00:18:02.298 "state": "online", 00:18:02.298 "raid_level": "raid1", 00:18:02.298 "superblock": true, 00:18:02.298 "num_base_bdevs": 4, 00:18:02.298 "num_base_bdevs_discovered": 3, 00:18:02.298 "num_base_bdevs_operational": 3, 00:18:02.298 "base_bdevs_list": [ 00:18:02.298 { 00:18:02.298 "name": null, 00:18:02.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.298 "is_configured": false, 00:18:02.298 "data_offset": 2048, 00:18:02.298 "data_size": 63488 00:18:02.298 }, 00:18:02.298 { 00:18:02.298 "name": "BaseBdev2", 00:18:02.298 "uuid": "622ec2b1-4482-553d-a7a6-1681b35cf045", 00:18:02.298 "is_configured": true, 00:18:02.298 "data_offset": 2048, 00:18:02.298 "data_size": 63488 00:18:02.298 }, 00:18:02.298 { 00:18:02.298 "name": "BaseBdev3", 00:18:02.298 "uuid": "afa22c22-a048-52aa-83a8-13b2a4449b2a", 00:18:02.298 "is_configured": true, 00:18:02.298 "data_offset": 2048, 00:18:02.298 "data_size": 63488 00:18:02.298 }, 00:18:02.298 { 00:18:02.298 "name": "BaseBdev4", 00:18:02.298 "uuid": "566a2e06-051b-5a99-b740-0c9650a4fd9c", 00:18:02.298 "is_configured": true, 00:18:02.298 "data_offset": 2048, 00:18:02.298 "data_size": 63488 00:18:02.298 } 00:18:02.298 ] 00:18:02.298 }' 00:18:02.298 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.298 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.556 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:02.815 [2024-07-24 18:54:47.719616] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:02.815 [2024-07-24 18:54:47.719646] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:02.815 [2024-07-24 18:54:47.721756] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:02.815 [2024-07-24 18:54:47.721779] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:02.815 [2024-07-24 18:54:47.721840] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:02.815 [2024-07-24 18:54:47.721846] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1618d90 name raid_bdev1, state offline 00:18:02.815 0 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2148634 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2148634 ']' 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@952 -- # kill -0 2148634 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2148634 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2148634' 00:18:02.815 killing process with pid 2148634 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2148634 00:18:02.815 [2024-07-24 18:54:47.788572] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:02.815 18:54:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2148634 00:18:02.815 [2024-07-24 18:54:47.815458] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:03.074 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.xntcTU7Jmo 00:18:03.074 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:03.074 18:54:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:03.074 00:18:03.074 real 0m5.948s 00:18:03.074 user 0m9.389s 00:18:03.074 sys 0m0.842s 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:03.074 18:54:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.074 ************************************ 00:18:03.074 END TEST raid_write_error_test 00:18:03.074 ************************************ 00:18:03.074 18:54:48 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:18:03.074 18:54:48 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:18:03.074 18:54:48 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:18:03.074 18:54:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:03.074 18:54:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:03.074 18:54:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:03.074 ************************************ 00:18:03.074 START TEST raid_rebuild_test 00:18:03.074 ************************************ 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@570 -- # local superblock=false 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:03.074 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2149647 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2149647 /var/tmp/spdk-raid.sock 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2149647 ']' 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:03.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:03.333 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.333 [2024-07-24 18:54:48.139501] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
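The rebuild scenario traced below can likewise be sketched as a short RPC sequence; the spare sits behind a delay bdev, presumably so the rebuild stays observable while it runs. Socket path, bdev names, and parameters are those that appear in the log that follows, and this sketch is not part of the captured output.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Two passthru-wrapped malloc base bdevs plus a delayed spare
for i in 1 2; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done
$RPC bdev_malloc_create 32 512 -b spare_malloc
$RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$RPC bdev_passthru_create -b spare_delay -p spare
# raid1 without a superblock; fill it with random data over an NBD mapping
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$RPC nbd_start_disk raid_bdev1 /dev/nbd0
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
$RPC nbd_stop_disk /dev/nbd0
# Degrade the array, then attach the spare to kick off the rebuild
$RPC bdev_raid_remove_base_bdev BaseBdev1
$RPC bdev_raid_add_base_bdev raid_bdev1 spare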
00:18:03.333 [2024-07-24 18:54:48.139542] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2149647 ] 00:18:03.333 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:03.333 Zero copy mechanism will not be used. 00:18:03.333 [2024-07-24 18:54:48.203721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:03.333 [2024-07-24 18:54:48.274666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:03.333 [2024-07-24 18:54:48.328280] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:03.333 [2024-07-24 18:54:48.328305] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:04.270 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:04.270 18:54:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:18:04.270 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:04.270 18:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:04.270 BaseBdev1_malloc 00:18:04.270 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:04.270 [2024-07-24 18:54:49.240051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:04.270 [2024-07-24 18:54:49.240088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:04.270 [2024-07-24 18:54:49.240100] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173f130 00:18:04.270 [2024-07-24 18:54:49.240122] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:04.270 [2024-07-24 18:54:49.241150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:04.270 [2024-07-24 18:54:49.241169] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:04.270 BaseBdev1 00:18:04.270 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:04.270 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:04.528 BaseBdev2_malloc 00:18:04.528 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:04.787 [2024-07-24 18:54:49.584274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:04.787 [2024-07-24 18:54:49.584304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:04.787 [2024-07-24 18:54:49.584316] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e4fa0 00:18:04.787 [2024-07-24 18:54:49.584321] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:04.787 [2024-07-24 18:54:49.585328] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:04.787 [2024-07-24 18:54:49.585346] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:04.787 BaseBdev2 00:18:04.787 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:04.787 spare_malloc 00:18:04.787 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:05.045 spare_delay 00:18:05.045 18:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:05.304 [2024-07-24 18:54:50.121100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:05.304 [2024-07-24 18:54:50.121142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:05.304 [2024-07-24 18:54:50.121155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e6f40 00:18:05.304 [2024-07-24 18:54:50.121161] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:05.304 [2024-07-24 18:54:50.122272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:05.304 [2024-07-24 18:54:50.122294] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:05.304 spare 00:18:05.304 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:05.304 [2024-07-24 18:54:50.297561] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:05.304 [2024-07-24 18:54:50.298420] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:05.304 [2024-07-24 18:54:50.298478] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e8370 00:18:05.304 [2024-07-24 18:54:50.298485] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:05.304 [2024-07-24 18:54:50.298614] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e71d0 00:18:05.304 [2024-07-24 18:54:50.298707] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e8370 00:18:05.304 [2024-07-24 18:54:50.298713] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e8370 00:18:05.304 [2024-07-24 18:54:50.298781] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=2 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.563 "name": "raid_bdev1", 00:18:05.563 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:05.563 "strip_size_kb": 0, 00:18:05.563 "state": "online", 00:18:05.563 "raid_level": "raid1", 00:18:05.563 "superblock": false, 00:18:05.563 "num_base_bdevs": 2, 00:18:05.563 "num_base_bdevs_discovered": 2, 00:18:05.563 "num_base_bdevs_operational": 2, 00:18:05.563 "base_bdevs_list": [ 00:18:05.563 { 00:18:05.563 "name": "BaseBdev1", 00:18:05.563 "uuid": "32072301-9185-5144-a5ae-ea7870a71c71", 00:18:05.563 "is_configured": true, 00:18:05.563 "data_offset": 0, 00:18:05.563 "data_size": 65536 00:18:05.563 }, 00:18:05.563 { 00:18:05.563 "name": "BaseBdev2", 00:18:05.563 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:05.563 "is_configured": true, 00:18:05.563 "data_offset": 0, 00:18:05.563 "data_size": 65536 00:18:05.563 } 00:18:05.563 ] 00:18:05.563 }' 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.563 18:54:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.130 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:06.130 18:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:06.130 [2024-07-24 18:54:51.111887] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:06.130 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:18:06.130 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.130 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:06.407 18:54:51 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:06.407 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:06.665 [2024-07-24 18:54:51.436577] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18f0cf0 00:18:06.665 /dev/nbd0 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:06.665 1+0 records in 00:18:06.665 1+0 records out 00:18:06.665 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268206 s, 15.3 MB/s 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:18:06.665 18:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:18:09.957 65536+0 records in 00:18:09.957 65536+0 records out 00:18:09.957 33554432 bytes (34 MB, 32 MiB) copied, 3.46736 s, 9.7 MB/s 00:18:09.957 
18:54:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:09.957 18:54:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:10.216 [2024-07-24 18:54:55.134941] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:10.216 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:10.474 [2024-07-24 18:54:55.299403] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.474 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:10.732 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.732 "name": "raid_bdev1", 
00:18:10.732 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:10.732 "strip_size_kb": 0, 00:18:10.732 "state": "online", 00:18:10.732 "raid_level": "raid1", 00:18:10.732 "superblock": false, 00:18:10.732 "num_base_bdevs": 2, 00:18:10.732 "num_base_bdevs_discovered": 1, 00:18:10.732 "num_base_bdevs_operational": 1, 00:18:10.732 "base_bdevs_list": [ 00:18:10.732 { 00:18:10.732 "name": null, 00:18:10.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.732 "is_configured": false, 00:18:10.732 "data_offset": 0, 00:18:10.732 "data_size": 65536 00:18:10.732 }, 00:18:10.732 { 00:18:10.732 "name": "BaseBdev2", 00:18:10.732 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:10.732 "is_configured": true, 00:18:10.732 "data_offset": 0, 00:18:10.732 "data_size": 65536 00:18:10.732 } 00:18:10.732 ] 00:18:10.732 }' 00:18:10.732 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.732 18:54:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.990 18:54:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:11.248 [2024-07-24 18:54:56.149596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:11.248 [2024-07-24 18:54:56.153897] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1737520 00:18:11.248 [2024-07-24 18:54:56.155297] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:11.248 18:54:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.188 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:12.502 "name": "raid_bdev1", 00:18:12.502 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:12.502 "strip_size_kb": 0, 00:18:12.502 "state": "online", 00:18:12.502 "raid_level": "raid1", 00:18:12.502 "superblock": false, 00:18:12.502 "num_base_bdevs": 2, 00:18:12.502 "num_base_bdevs_discovered": 2, 00:18:12.502 "num_base_bdevs_operational": 2, 00:18:12.502 "process": { 00:18:12.502 "type": "rebuild", 00:18:12.502 "target": "spare", 00:18:12.502 "progress": { 00:18:12.502 "blocks": 22528, 00:18:12.502 "percent": 34 00:18:12.502 } 00:18:12.502 }, 00:18:12.502 "base_bdevs_list": [ 00:18:12.502 { 00:18:12.502 "name": "spare", 00:18:12.502 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:12.502 "is_configured": true, 00:18:12.502 "data_offset": 0, 00:18:12.502 "data_size": 65536 00:18:12.502 }, 00:18:12.502 { 00:18:12.502 "name": "BaseBdev2", 00:18:12.502 "uuid": 
"2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:12.502 "is_configured": true, 00:18:12.502 "data_offset": 0, 00:18:12.502 "data_size": 65536 00:18:12.502 } 00:18:12.502 ] 00:18:12.502 }' 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:12.502 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:12.766 [2024-07-24 18:54:57.574404] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:12.766 [2024-07-24 18:54:57.665778] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:12.766 [2024-07-24 18:54:57.665807] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:12.766 [2024-07-24 18:54:57.665816] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:12.766 [2024-07-24 18:54:57.665820] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.766 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.025 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.025 "name": "raid_bdev1", 00:18:13.025 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:13.025 "strip_size_kb": 0, 00:18:13.025 "state": "online", 00:18:13.025 "raid_level": "raid1", 00:18:13.025 "superblock": false, 00:18:13.025 "num_base_bdevs": 2, 00:18:13.025 "num_base_bdevs_discovered": 1, 00:18:13.025 "num_base_bdevs_operational": 1, 00:18:13.025 "base_bdevs_list": [ 00:18:13.025 { 00:18:13.025 "name": null, 00:18:13.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.025 "is_configured": false, 00:18:13.025 "data_offset": 0, 00:18:13.025 "data_size": 65536 00:18:13.025 
}, 00:18:13.025 { 00:18:13.025 "name": "BaseBdev2", 00:18:13.025 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:13.025 "is_configured": true, 00:18:13.025 "data_offset": 0, 00:18:13.025 "data_size": 65536 00:18:13.025 } 00:18:13.025 ] 00:18:13.025 }' 00:18:13.025 18:54:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.025 18:54:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:13.592 "name": "raid_bdev1", 00:18:13.592 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:13.592 "strip_size_kb": 0, 00:18:13.592 "state": "online", 00:18:13.592 "raid_level": "raid1", 00:18:13.592 "superblock": false, 00:18:13.592 "num_base_bdevs": 2, 00:18:13.592 "num_base_bdevs_discovered": 1, 00:18:13.592 "num_base_bdevs_operational": 1, 00:18:13.592 "base_bdevs_list": [ 00:18:13.592 { 00:18:13.592 "name": null, 00:18:13.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.592 "is_configured": false, 00:18:13.592 "data_offset": 0, 00:18:13.592 "data_size": 65536 00:18:13.592 }, 00:18:13.592 { 00:18:13.592 "name": "BaseBdev2", 00:18:13.592 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:13.592 "is_configured": true, 00:18:13.592 "data_offset": 0, 00:18:13.592 "data_size": 65536 00:18:13.592 } 00:18:13.592 ] 00:18:13.592 }' 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:13.592 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:13.850 [2024-07-24 18:54:58.748613] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:13.850 [2024-07-24 18:54:58.752964] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18f0cf0 00:18:13.850 [2024-07-24 18:54:58.754027] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:13.850 18:54:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:14.784 18:54:59 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.784 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.042 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:15.042 "name": "raid_bdev1", 00:18:15.042 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:15.042 "strip_size_kb": 0, 00:18:15.042 "state": "online", 00:18:15.042 "raid_level": "raid1", 00:18:15.042 "superblock": false, 00:18:15.042 "num_base_bdevs": 2, 00:18:15.042 "num_base_bdevs_discovered": 2, 00:18:15.042 "num_base_bdevs_operational": 2, 00:18:15.042 "process": { 00:18:15.042 "type": "rebuild", 00:18:15.042 "target": "spare", 00:18:15.042 "progress": { 00:18:15.042 "blocks": 22528, 00:18:15.042 "percent": 34 00:18:15.042 } 00:18:15.042 }, 00:18:15.042 "base_bdevs_list": [ 00:18:15.042 { 00:18:15.042 "name": "spare", 00:18:15.042 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:15.042 "is_configured": true, 00:18:15.042 "data_offset": 0, 00:18:15.042 "data_size": 65536 00:18:15.042 }, 00:18:15.042 { 00:18:15.042 "name": "BaseBdev2", 00:18:15.042 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:15.042 "is_configured": true, 00:18:15.042 "data_offset": 0, 00:18:15.042 "data_size": 65536 00:18:15.042 } 00:18:15.042 ] 00:18:15.042 }' 00:18:15.043 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:15.043 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:15.043 18:54:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=581 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.043 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:15.301 "name": "raid_bdev1", 00:18:15.301 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:15.301 "strip_size_kb": 0, 00:18:15.301 "state": "online", 00:18:15.301 "raid_level": "raid1", 00:18:15.301 "superblock": false, 00:18:15.301 "num_base_bdevs": 2, 00:18:15.301 "num_base_bdevs_discovered": 2, 00:18:15.301 "num_base_bdevs_operational": 2, 00:18:15.301 "process": { 00:18:15.301 "type": "rebuild", 00:18:15.301 "target": "spare", 00:18:15.301 "progress": { 00:18:15.301 "blocks": 28672, 00:18:15.301 "percent": 43 00:18:15.301 } 00:18:15.301 }, 00:18:15.301 "base_bdevs_list": [ 00:18:15.301 { 00:18:15.301 "name": "spare", 00:18:15.301 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:15.301 "is_configured": true, 00:18:15.301 "data_offset": 0, 00:18:15.301 "data_size": 65536 00:18:15.301 }, 00:18:15.301 { 00:18:15.301 "name": "BaseBdev2", 00:18:15.301 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:15.301 "is_configured": true, 00:18:15.301 "data_offset": 0, 00:18:15.301 "data_size": 65536 00:18:15.301 } 00:18:15.301 ] 00:18:15.301 }' 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:15.301 18:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:16.678 "name": "raid_bdev1", 00:18:16.678 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:16.678 "strip_size_kb": 0, 00:18:16.678 "state": "online", 00:18:16.678 "raid_level": "raid1", 00:18:16.678 "superblock": false, 00:18:16.678 "num_base_bdevs": 2, 00:18:16.678 "num_base_bdevs_discovered": 2, 00:18:16.678 "num_base_bdevs_operational": 2, 00:18:16.678 "process": { 00:18:16.678 "type": "rebuild", 00:18:16.678 "target": "spare", 00:18:16.678 "progress": { 00:18:16.678 "blocks": 53248, 00:18:16.678 
"percent": 81 00:18:16.678 } 00:18:16.678 }, 00:18:16.678 "base_bdevs_list": [ 00:18:16.678 { 00:18:16.678 "name": "spare", 00:18:16.678 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:16.678 "is_configured": true, 00:18:16.678 "data_offset": 0, 00:18:16.678 "data_size": 65536 00:18:16.678 }, 00:18:16.678 { 00:18:16.678 "name": "BaseBdev2", 00:18:16.678 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:16.678 "is_configured": true, 00:18:16.678 "data_offset": 0, 00:18:16.678 "data_size": 65536 00:18:16.678 } 00:18:16.678 ] 00:18:16.678 }' 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:16.678 18:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:17.246 [2024-07-24 18:55:01.976386] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:17.246 [2024-07-24 18:55:01.976430] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:17.246 [2024-07-24 18:55:01.976459] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:17.504 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:17.504 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:17.504 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:17.504 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:17.504 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:17.505 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:17.505 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.505 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:17.763 "name": "raid_bdev1", 00:18:17.763 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:17.763 "strip_size_kb": 0, 00:18:17.763 "state": "online", 00:18:17.763 "raid_level": "raid1", 00:18:17.763 "superblock": false, 00:18:17.763 "num_base_bdevs": 2, 00:18:17.763 "num_base_bdevs_discovered": 2, 00:18:17.763 "num_base_bdevs_operational": 2, 00:18:17.763 "base_bdevs_list": [ 00:18:17.763 { 00:18:17.763 "name": "spare", 00:18:17.763 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:17.763 "is_configured": true, 00:18:17.763 "data_offset": 0, 00:18:17.763 "data_size": 65536 00:18:17.763 }, 00:18:17.763 { 00:18:17.763 "name": "BaseBdev2", 00:18:17.763 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:17.763 "is_configured": true, 00:18:17.763 "data_offset": 0, 00:18:17.763 "data_size": 65536 00:18:17.763 } 00:18:17.763 ] 00:18:17.763 }' 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:17.763 18:55:02 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.763 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:18.022 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:18.022 "name": "raid_bdev1", 00:18:18.022 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:18.022 "strip_size_kb": 0, 00:18:18.022 "state": "online", 00:18:18.022 "raid_level": "raid1", 00:18:18.022 "superblock": false, 00:18:18.022 "num_base_bdevs": 2, 00:18:18.022 "num_base_bdevs_discovered": 2, 00:18:18.022 "num_base_bdevs_operational": 2, 00:18:18.022 "base_bdevs_list": [ 00:18:18.022 { 00:18:18.022 "name": "spare", 00:18:18.022 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:18.022 "is_configured": true, 00:18:18.022 "data_offset": 0, 00:18:18.022 "data_size": 65536 00:18:18.022 }, 00:18:18.022 { 00:18:18.022 "name": "BaseBdev2", 00:18:18.022 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:18.022 "is_configured": true, 00:18:18.022 "data_offset": 0, 00:18:18.022 "data_size": 65536 00:18:18.022 } 00:18:18.022 ] 00:18:18.022 }' 00:18:18.022 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:18.022 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:18.022 18:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.022 18:55:03 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.022 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:18.281 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.281 "name": "raid_bdev1", 00:18:18.281 "uuid": "954ce811-1b96-422a-ab02-de733d5ea49c", 00:18:18.281 "strip_size_kb": 0, 00:18:18.281 "state": "online", 00:18:18.281 "raid_level": "raid1", 00:18:18.281 "superblock": false, 00:18:18.281 "num_base_bdevs": 2, 00:18:18.281 "num_base_bdevs_discovered": 2, 00:18:18.281 "num_base_bdevs_operational": 2, 00:18:18.281 "base_bdevs_list": [ 00:18:18.281 { 00:18:18.281 "name": "spare", 00:18:18.281 "uuid": "55754d53-afe8-5827-b81c-3cd9986640f6", 00:18:18.281 "is_configured": true, 00:18:18.281 "data_offset": 0, 00:18:18.281 "data_size": 65536 00:18:18.281 }, 00:18:18.281 { 00:18:18.281 "name": "BaseBdev2", 00:18:18.281 "uuid": "2fb38ab8-88aa-5f98-a9b8-08f0e1b2f835", 00:18:18.281 "is_configured": true, 00:18:18.281 "data_offset": 0, 00:18:18.281 "data_size": 65536 00:18:18.281 } 00:18:18.281 ] 00:18:18.281 }' 00:18:18.281 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.281 18:55:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.847 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:18.847 [2024-07-24 18:55:03.792783] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:18.847 [2024-07-24 18:55:03.792805] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.847 [2024-07-24 18:55:03.792852] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.847 [2024-07-24 18:55:03.792892] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.847 [2024-07-24 18:55:03.792898] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e8370 name raid_bdev1, state offline 00:18:18.847 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.847 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@10 -- # local bdev_list 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:19.105 18:55:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:19.364 /dev/nbd0 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:19.364 1+0 records in 00:18:19.364 1+0 records out 00:18:19.364 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211694 s, 19.3 MB/s 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:19.364 /dev/nbd1 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:19.364 
18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:19.364 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:19.623 1+0 records in 00:18:19.623 1+0 records out 00:18:19.623 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022103 s, 18.5 MB/s 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:19.623 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:19.881 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:19.881 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:19.881 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2149647 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2149647 ']' 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2149647 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2149647 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2149647' 00:18:19.882 killing process with pid 2149647 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2149647 00:18:19.882 Received shutdown signal, test time was about 60.000000 seconds 00:18:19.882 00:18:19.882 Latency(us) 00:18:19.882 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:19.882 =================================================================================================================== 00:18:19.882 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:19.882 [2024-07-24 18:55:04.851270] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:19.882 18:55:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2149647 00:18:19.882 [2024-07-24 18:55:04.874153] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:18:20.141 00:18:20.141 real 0m16.970s 00:18:20.141 user 0m23.288s 00:18:20.141 sys 0m2.766s 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.141 ************************************ 00:18:20.141 END TEST raid_rebuild_test 00:18:20.141 ************************************ 00:18:20.141 18:55:05 bdev_raid -- bdev/bdev_raid.sh@878 -- # 
run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:18:20.141 18:55:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:20.141 18:55:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:20.141 18:55:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:20.141 ************************************ 00:18:20.141 START TEST raid_rebuild_test_sb 00:18:20.141 ************************************ 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2152740 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2152740 /var/tmp/spdk-raid.sock 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2152740 ']' 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:20.141 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:20.141 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.400 [2024-07-24 18:55:05.177880] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:18:20.400 [2024-07-24 18:55:05.177920] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2152740 ] 00:18:20.400 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:20.400 Zero copy mechanism will not be used. 00:18:20.400 [2024-07-24 18:55:05.245578] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:20.400 [2024-07-24 18:55:05.318774] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.400 [2024-07-24 18:55:05.370515] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:20.400 [2024-07-24 18:55:05.370545] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:20.968 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:20.968 18:55:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:20.968 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:20.968 18:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:21.226 BaseBdev1_malloc 00:18:21.226 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:21.485 [2024-07-24 18:55:06.290387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:21.485 [2024-07-24 18:55:06.290420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.485 [2024-07-24 18:55:06.290433] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x174a130 00:18:21.485 [2024-07-24 18:55:06.290457] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.485 [2024-07-24 18:55:06.291521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.485 [2024-07-24 18:55:06.291542] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:21.485 BaseBdev1 00:18:21.485 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:21.485 18:55:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:21.485 BaseBdev2_malloc 00:18:21.485 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:21.743 [2024-07-24 18:55:06.646701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:21.743 [2024-07-24 18:55:06.646731] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.743 [2024-07-24 18:55:06.646742] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18effa0 00:18:21.743 [2024-07-24 18:55:06.646748] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.743 [2024-07-24 18:55:06.647717] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.743 [2024-07-24 18:55:06.647737] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:21.743 BaseBdev2 00:18:21.743 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:22.001 spare_malloc 00:18:22.001 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:22.001 spare_delay 00:18:22.001 18:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:22.260 [2024-07-24 18:55:07.131201] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:22.260 [2024-07-24 18:55:07.131229] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.260 [2024-07-24 18:55:07.131240] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18f1f40 00:18:22.260 [2024-07-24 18:55:07.131246] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.260 [2024-07-24 18:55:07.132171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.260 [2024-07-24 18:55:07.132189] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:22.260 spare 00:18:22.260 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:22.519 [2024-07-24 18:55:07.299661] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.519 [2024-07-24 18:55:07.300453] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:22.519 [2024-07-24 18:55:07.300570] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18f3370 00:18:22.519 [2024-07-24 18:55:07.300579] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:22.519 [2024-07-24 18:55:07.300694] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18f21d0 
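For reference while following the trace, the raid_rebuild_test_sb setup logged above reduces to the RPC sequence below. This is a minimal sketch reconstructed from the commands already shown in the log; rpc.py abbreviates the full /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py path used in the trace, and the socket is the bdevperf RPC socket started for this test.

    # Two passthru base bdevs layered on malloc bdevs (BaseBdev1, BaseBdev2)
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
    # The rebuild target: a zero-latency delay bdev wrapped in a passthru named "spare"
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
    rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
    # RAID1 with an on-disk superblock (-s), the variant exercised by raid_rebuild_test_sb
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
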
00:18:22.519 [2024-07-24 18:55:07.300785] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18f3370 00:18:22.519 [2024-07-24 18:55:07.300790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18f3370 00:18:22.519 [2024-07-24 18:55:07.300849] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.519 "name": "raid_bdev1", 00:18:22.519 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:22.519 "strip_size_kb": 0, 00:18:22.519 "state": "online", 00:18:22.519 "raid_level": "raid1", 00:18:22.519 "superblock": true, 00:18:22.519 "num_base_bdevs": 2, 00:18:22.519 "num_base_bdevs_discovered": 2, 00:18:22.519 "num_base_bdevs_operational": 2, 00:18:22.519 "base_bdevs_list": [ 00:18:22.519 { 00:18:22.519 "name": "BaseBdev1", 00:18:22.519 "uuid": "762f7c4e-15cf-5c24-9417-3459b4e827c2", 00:18:22.519 "is_configured": true, 00:18:22.519 "data_offset": 2048, 00:18:22.519 "data_size": 63488 00:18:22.519 }, 00:18:22.519 { 00:18:22.519 "name": "BaseBdev2", 00:18:22.519 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:22.519 "is_configured": true, 00:18:22.519 "data_offset": 2048, 00:18:22.519 "data_size": 63488 00:18:22.519 } 00:18:22.519 ] 00:18:22.519 }' 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.519 18:55:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.085 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:23.085 18:55:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:23.344 [2024-07-24 18:55:08.113921] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:18:23.344 18:55:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:23.344 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:23.602 [2024-07-24 18:55:08.442641] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18f21d0 00:18:23.602 /dev/nbd0 00:18:23.602 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:23.603 1+0 records in 00:18:23.603 1+0 records out 00:18:23.603 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223048 s, 18.4 MB/s 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:18:23.603 18:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:18:26.887 63488+0 records in 00:18:26.887 63488+0 records out 00:18:26.887 32505856 bytes (33 MB, 31 MiB) copied, 3.32522 s, 9.8 MB/s 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:26.887 18:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:27.146 [2024-07-24 18:55:12.011298] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:27.146 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:27.405 [2024-07-24 18:55:12.171877] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
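The verify_raid_bdev_state raid_bdev1 online raid1 0 1 call entered above amounts to the query sketched below, assembled from the commands visible in the trace (rpc.py again stands for the full scripts/rpc.py path):

    # After bdev_raid_remove_base_bdev BaseBdev1, the array is expected to stay
    # online in degraded mode: one base bdev discovered and operational.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1")'
    # Expected in the JSON that follows: "state": "online", "raid_level": "raid1",
    # "num_base_bdevs_discovered": 1, "num_base_bdevs_operational": 1.
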
00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.405 "name": "raid_bdev1", 00:18:27.405 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:27.405 "strip_size_kb": 0, 00:18:27.405 "state": "online", 00:18:27.405 "raid_level": "raid1", 00:18:27.405 "superblock": true, 00:18:27.405 "num_base_bdevs": 2, 00:18:27.405 "num_base_bdevs_discovered": 1, 00:18:27.405 "num_base_bdevs_operational": 1, 00:18:27.405 "base_bdevs_list": [ 00:18:27.405 { 00:18:27.405 "name": null, 00:18:27.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.405 "is_configured": false, 00:18:27.405 "data_offset": 2048, 00:18:27.405 "data_size": 63488 00:18:27.405 }, 00:18:27.405 { 00:18:27.405 "name": "BaseBdev2", 00:18:27.405 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:27.405 "is_configured": true, 00:18:27.405 "data_offset": 2048, 00:18:27.405 "data_size": 63488 00:18:27.405 } 00:18:27.405 ] 00:18:27.405 }' 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.405 18:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.972 18:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:28.230 [2024-07-24 18:55:12.997994] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:28.230 [2024-07-24 18:55:13.002300] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18f21d0 00:18:28.230 [2024-07-24 18:55:13.003718] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:28.230 18:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.165 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:29.424 "name": "raid_bdev1", 00:18:29.424 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:29.424 "strip_size_kb": 0, 00:18:29.424 "state": "online", 00:18:29.424 "raid_level": "raid1", 00:18:29.424 "superblock": true, 00:18:29.424 "num_base_bdevs": 2, 00:18:29.424 "num_base_bdevs_discovered": 2, 00:18:29.424 "num_base_bdevs_operational": 2, 00:18:29.424 "process": { 00:18:29.424 "type": "rebuild", 00:18:29.424 "target": "spare", 00:18:29.424 "progress": { 00:18:29.424 "blocks": 22528, 00:18:29.424 "percent": 35 00:18:29.424 } 00:18:29.424 }, 00:18:29.424 "base_bdevs_list": [ 00:18:29.424 { 00:18:29.424 "name": "spare", 00:18:29.424 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:29.424 "is_configured": true, 00:18:29.424 "data_offset": 2048, 00:18:29.424 "data_size": 63488 00:18:29.424 }, 00:18:29.424 { 00:18:29.424 "name": "BaseBdev2", 00:18:29.424 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:29.424 "is_configured": true, 00:18:29.424 "data_offset": 2048, 00:18:29.424 "data_size": 63488 00:18:29.424 } 00:18:29.424 ] 00:18:29.424 }' 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:29.424 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:29.683 [2024-07-24 18:55:14.438395] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:29.683 [2024-07-24 18:55:14.514246] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:29.683 [2024-07-24 18:55:14.514279] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.683 [2024-07-24 18:55:14.514289] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:29.683 [2024-07-24 18:55:14.514293] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:29.683 18:55:14 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.683 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.941 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.941 "name": "raid_bdev1", 00:18:29.941 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:29.941 "strip_size_kb": 0, 00:18:29.941 "state": "online", 00:18:29.941 "raid_level": "raid1", 00:18:29.941 "superblock": true, 00:18:29.941 "num_base_bdevs": 2, 00:18:29.941 "num_base_bdevs_discovered": 1, 00:18:29.941 "num_base_bdevs_operational": 1, 00:18:29.941 "base_bdevs_list": [ 00:18:29.941 { 00:18:29.941 "name": null, 00:18:29.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.941 "is_configured": false, 00:18:29.941 "data_offset": 2048, 00:18:29.941 "data_size": 63488 00:18:29.941 }, 00:18:29.941 { 00:18:29.941 "name": "BaseBdev2", 00:18:29.941 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:29.941 "is_configured": true, 00:18:29.941 "data_offset": 2048, 00:18:29.941 "data_size": 63488 00:18:29.941 } 00:18:29.941 ] 00:18:29.941 }' 00:18:29.941 18:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.941 18:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.199 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:30.459 "name": "raid_bdev1", 00:18:30.459 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:30.459 "strip_size_kb": 0, 00:18:30.459 "state": "online", 00:18:30.459 "raid_level": "raid1", 00:18:30.459 "superblock": true, 00:18:30.459 "num_base_bdevs": 2, 00:18:30.459 "num_base_bdevs_discovered": 1, 00:18:30.459 "num_base_bdevs_operational": 1, 00:18:30.459 "base_bdevs_list": [ 00:18:30.459 { 00:18:30.459 "name": null, 00:18:30.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.459 "is_configured": false, 00:18:30.459 "data_offset": 2048, 00:18:30.459 "data_size": 63488 00:18:30.459 }, 00:18:30.459 { 00:18:30.459 "name": "BaseBdev2", 00:18:30.459 "uuid": 
"ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:30.459 "is_configured": true, 00:18:30.459 "data_offset": 2048, 00:18:30.459 "data_size": 63488 00:18:30.459 } 00:18:30.459 ] 00:18:30.459 }' 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:30.459 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:30.719 [2024-07-24 18:55:15.581074] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:30.719 [2024-07-24 18:55:15.585303] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbcf0 00:18:30.719 [2024-07-24 18:55:15.586356] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:30.719 18:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.694 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:31.953 "name": "raid_bdev1", 00:18:31.953 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:31.953 "strip_size_kb": 0, 00:18:31.953 "state": "online", 00:18:31.953 "raid_level": "raid1", 00:18:31.953 "superblock": true, 00:18:31.953 "num_base_bdevs": 2, 00:18:31.953 "num_base_bdevs_discovered": 2, 00:18:31.953 "num_base_bdevs_operational": 2, 00:18:31.953 "process": { 00:18:31.953 "type": "rebuild", 00:18:31.953 "target": "spare", 00:18:31.953 "progress": { 00:18:31.953 "blocks": 22528, 00:18:31.953 "percent": 35 00:18:31.953 } 00:18:31.953 }, 00:18:31.953 "base_bdevs_list": [ 00:18:31.953 { 00:18:31.953 "name": "spare", 00:18:31.953 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:31.953 "is_configured": true, 00:18:31.953 "data_offset": 2048, 00:18:31.953 "data_size": 63488 00:18:31.953 }, 00:18:31.953 { 00:18:31.953 "name": "BaseBdev2", 00:18:31.953 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:31.953 "is_configured": true, 00:18:31.953 "data_offset": 2048, 00:18:31.953 "data_size": 63488 00:18:31.953 } 00:18:31.953 ] 00:18:31.953 }' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:18:31.953 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=597 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.953 18:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:32.212 "name": "raid_bdev1", 00:18:32.212 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:32.212 "strip_size_kb": 0, 00:18:32.212 "state": "online", 00:18:32.212 "raid_level": "raid1", 00:18:32.212 "superblock": true, 00:18:32.212 "num_base_bdevs": 2, 00:18:32.212 "num_base_bdevs_discovered": 2, 00:18:32.212 "num_base_bdevs_operational": 2, 00:18:32.212 "process": { 00:18:32.212 "type": "rebuild", 00:18:32.212 "target": "spare", 00:18:32.212 "progress": { 00:18:32.212 "blocks": 28672, 00:18:32.212 "percent": 45 00:18:32.212 } 00:18:32.212 }, 00:18:32.212 "base_bdevs_list": [ 00:18:32.212 { 00:18:32.212 "name": "spare", 00:18:32.212 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:32.212 "is_configured": true, 00:18:32.212 "data_offset": 2048, 00:18:32.212 "data_size": 63488 00:18:32.212 }, 00:18:32.212 { 00:18:32.212 "name": "BaseBdev2", 00:18:32.212 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:32.212 "is_configured": true, 00:18:32.212 "data_offset": 2048, 00:18:32.212 "data_size": 63488 00:18:32.212 } 00:18:32.212 ] 00:18:32.212 }' 00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:32.212 18:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.148 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:33.407 "name": "raid_bdev1", 00:18:33.407 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:33.407 "strip_size_kb": 0, 00:18:33.407 "state": "online", 00:18:33.407 "raid_level": "raid1", 00:18:33.407 "superblock": true, 00:18:33.407 "num_base_bdevs": 2, 00:18:33.407 "num_base_bdevs_discovered": 2, 00:18:33.407 "num_base_bdevs_operational": 2, 00:18:33.407 "process": { 00:18:33.407 "type": "rebuild", 00:18:33.407 "target": "spare", 00:18:33.407 "progress": { 00:18:33.407 "blocks": 53248, 00:18:33.407 "percent": 83 00:18:33.407 } 00:18:33.407 }, 00:18:33.407 "base_bdevs_list": [ 00:18:33.407 { 00:18:33.407 "name": "spare", 00:18:33.407 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:33.407 "is_configured": true, 00:18:33.407 "data_offset": 2048, 00:18:33.407 "data_size": 63488 00:18:33.407 }, 00:18:33.407 { 00:18:33.407 "name": "BaseBdev2", 00:18:33.407 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:33.407 "is_configured": true, 00:18:33.407 "data_offset": 2048, 00:18:33.407 "data_size": 63488 00:18:33.407 } 00:18:33.407 ] 00:18:33.407 }' 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:33.407 18:55:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:33.974 [2024-07-24 18:55:18.708017] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:33.974 [2024-07-24 18:55:18.708061] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:33.974 [2024-07-24 18:55:18.708139] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:34.541 
18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:34.541 "name": "raid_bdev1", 00:18:34.541 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:34.541 "strip_size_kb": 0, 00:18:34.541 "state": "online", 00:18:34.541 "raid_level": "raid1", 00:18:34.541 "superblock": true, 00:18:34.541 "num_base_bdevs": 2, 00:18:34.541 "num_base_bdevs_discovered": 2, 00:18:34.541 "num_base_bdevs_operational": 2, 00:18:34.541 "base_bdevs_list": [ 00:18:34.541 { 00:18:34.541 "name": "spare", 00:18:34.541 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:34.541 "is_configured": true, 00:18:34.541 "data_offset": 2048, 00:18:34.541 "data_size": 63488 00:18:34.541 }, 00:18:34.541 { 00:18:34.541 "name": "BaseBdev2", 00:18:34.541 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:34.541 "is_configured": true, 00:18:34.541 "data_offset": 2048, 00:18:34.541 "data_size": 63488 00:18:34.541 } 00:18:34.541 ] 00:18:34.541 }' 00:18:34.541 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:34.800 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:34.800 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:34.801 "name": "raid_bdev1", 00:18:34.801 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:34.801 "strip_size_kb": 0, 00:18:34.801 "state": "online", 00:18:34.801 "raid_level": "raid1", 00:18:34.801 "superblock": true, 00:18:34.801 "num_base_bdevs": 2, 00:18:34.801 "num_base_bdevs_discovered": 2, 
00:18:34.801 "num_base_bdevs_operational": 2, 00:18:34.801 "base_bdevs_list": [ 00:18:34.801 { 00:18:34.801 "name": "spare", 00:18:34.801 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:34.801 "is_configured": true, 00:18:34.801 "data_offset": 2048, 00:18:34.801 "data_size": 63488 00:18:34.801 }, 00:18:34.801 { 00:18:34.801 "name": "BaseBdev2", 00:18:34.801 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:34.801 "is_configured": true, 00:18:34.801 "data_offset": 2048, 00:18:34.801 "data_size": 63488 00:18:34.801 } 00:18:34.801 ] 00:18:34.801 }' 00:18:34.801 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.059 18:55:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.059 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.059 "name": "raid_bdev1", 00:18:35.059 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:35.059 "strip_size_kb": 0, 00:18:35.059 "state": "online", 00:18:35.059 "raid_level": "raid1", 00:18:35.059 "superblock": true, 00:18:35.059 "num_base_bdevs": 2, 00:18:35.059 "num_base_bdevs_discovered": 2, 00:18:35.059 "num_base_bdevs_operational": 2, 00:18:35.059 "base_bdevs_list": [ 00:18:35.059 { 00:18:35.059 "name": "spare", 00:18:35.059 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:35.059 "is_configured": true, 00:18:35.059 "data_offset": 2048, 00:18:35.059 "data_size": 63488 00:18:35.059 }, 00:18:35.059 { 00:18:35.059 "name": "BaseBdev2", 00:18:35.059 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:35.059 "is_configured": true, 00:18:35.059 "data_offset": 2048, 00:18:35.059 "data_size": 63488 00:18:35.059 } 00:18:35.059 ] 00:18:35.059 }' 00:18:35.059 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.059 18:55:20 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@10 -- # set +x 00:18:35.626 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:35.626 [2024-07-24 18:55:20.633098] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:35.626 [2024-07-24 18:55:20.633121] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:35.626 [2024-07-24 18:55:20.633171] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:35.626 [2024-07-24 18:55:20.633212] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:35.626 [2024-07-24 18:55:20.633218] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18f3370 name raid_bdev1, state offline 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:35.884 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:36.142 /dev/nbd0 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@871 -- # break 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:36.142 1+0 records in 00:18:36.142 1+0 records out 00:18:36.142 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022104 s, 18.5 MB/s 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:36.142 18:55:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:36.142 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:36.401 /dev/nbd1 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:36.401 1+0 records in 00:18:36.401 1+0 records out 00:18:36.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230025 s, 17.8 MB/s 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 
4096 '!=' 0 ']' 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:36.401 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:18:36.659 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:36.917 18:55:21 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:37.176 [2024-07-24 18:55:21.973914] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:37.176 [2024-07-24 18:55:21.973948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.176 [2024-07-24 18:55:21.973959] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1744600 00:18:37.176 [2024-07-24 18:55:21.973965] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.176 [2024-07-24 18:55:21.975154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.176 [2024-07-24 18:55:21.975181] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:37.176 [2024-07-24 18:55:21.975224] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:37.176 [2024-07-24 18:55:21.975244] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:37.176 [2024-07-24 18:55:21.975332] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.176 spare 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.176 18:55:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.176 [2024-07-24 18:55:22.075623] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1742770 00:18:37.176 [2024-07-24 18:55:22.075632] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:37.176 [2024-07-24 18:55:22.075758] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbcf0 00:18:37.176 [2024-07-24 18:55:22.075858] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1742770 00:18:37.176 [2024-07-24 18:55:22.075863] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1742770 00:18:37.176 [2024-07-24 18:55:22.075931] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.176 18:55:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.176 "name": "raid_bdev1", 00:18:37.176 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:37.176 "strip_size_kb": 0, 00:18:37.176 "state": "online", 00:18:37.176 "raid_level": "raid1", 00:18:37.176 "superblock": true, 00:18:37.176 "num_base_bdevs": 2, 00:18:37.176 "num_base_bdevs_discovered": 2, 00:18:37.176 "num_base_bdevs_operational": 2, 00:18:37.176 "base_bdevs_list": [ 00:18:37.176 { 00:18:37.176 "name": "spare", 00:18:37.176 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:37.176 "is_configured": true, 00:18:37.176 "data_offset": 2048, 00:18:37.176 "data_size": 63488 00:18:37.176 }, 00:18:37.176 { 00:18:37.176 "name": "BaseBdev2", 00:18:37.176 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:37.176 "is_configured": true, 00:18:37.176 "data_offset": 2048, 00:18:37.176 "data_size": 63488 00:18:37.176 } 00:18:37.176 ] 00:18:37.176 }' 00:18:37.176 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.176 18:55:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.743 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:38.002 "name": "raid_bdev1", 00:18:38.002 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:38.002 "strip_size_kb": 0, 00:18:38.002 "state": "online", 00:18:38.002 "raid_level": "raid1", 00:18:38.002 "superblock": true, 00:18:38.002 "num_base_bdevs": 2, 00:18:38.002 "num_base_bdevs_discovered": 2, 00:18:38.002 "num_base_bdevs_operational": 2, 00:18:38.002 "base_bdevs_list": [ 00:18:38.002 { 00:18:38.002 "name": "spare", 00:18:38.002 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:38.002 "is_configured": true, 00:18:38.002 "data_offset": 2048, 00:18:38.002 "data_size": 63488 00:18:38.002 }, 00:18:38.002 { 00:18:38.002 "name": "BaseBdev2", 00:18:38.002 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:38.002 "is_configured": true, 00:18:38.002 "data_offset": 2048, 00:18:38.002 "data_size": 63488 00:18:38.002 } 00:18:38.002 ] 00:18:38.002 }' 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:38.002 18:55:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:18:38.002 18:55:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:38.261 [2024-07-24 18:55:23.201151] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.261 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:38.520 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.520 "name": "raid_bdev1", 00:18:38.520 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:38.520 "strip_size_kb": 0, 00:18:38.520 "state": "online", 00:18:38.520 "raid_level": "raid1", 00:18:38.520 "superblock": true, 00:18:38.520 "num_base_bdevs": 2, 00:18:38.520 "num_base_bdevs_discovered": 1, 00:18:38.520 "num_base_bdevs_operational": 1, 00:18:38.520 "base_bdevs_list": [ 00:18:38.520 { 00:18:38.520 "name": null, 00:18:38.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.520 "is_configured": false, 00:18:38.520 "data_offset": 2048, 00:18:38.520 "data_size": 63488 00:18:38.520 }, 00:18:38.520 { 00:18:38.520 "name": "BaseBdev2", 00:18:38.520 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:38.520 "is_configured": true, 00:18:38.520 "data_offset": 2048, 00:18:38.520 "data_size": 63488 00:18:38.520 } 00:18:38.520 ] 00:18:38.520 }' 00:18:38.520 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.520 18:55:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.087 18:55:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:39.088 [2024-07-24 18:55:24.027283] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:39.088 [2024-07-24 
18:55:24.027403] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:39.088 [2024-07-24 18:55:24.027413] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:39.088 [2024-07-24 18:55:24.027432] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:39.088 [2024-07-24 18:55:24.031671] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbcf0 00:18:39.088 [2024-07-24 18:55:24.032647] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:39.088 18:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:40.463 "name": "raid_bdev1", 00:18:40.463 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:40.463 "strip_size_kb": 0, 00:18:40.463 "state": "online", 00:18:40.463 "raid_level": "raid1", 00:18:40.463 "superblock": true, 00:18:40.463 "num_base_bdevs": 2, 00:18:40.463 "num_base_bdevs_discovered": 2, 00:18:40.463 "num_base_bdevs_operational": 2, 00:18:40.463 "process": { 00:18:40.463 "type": "rebuild", 00:18:40.463 "target": "spare", 00:18:40.463 "progress": { 00:18:40.463 "blocks": 22528, 00:18:40.463 "percent": 35 00:18:40.463 } 00:18:40.463 }, 00:18:40.463 "base_bdevs_list": [ 00:18:40.463 { 00:18:40.463 "name": "spare", 00:18:40.463 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:40.463 "is_configured": true, 00:18:40.463 "data_offset": 2048, 00:18:40.463 "data_size": 63488 00:18:40.463 }, 00:18:40.463 { 00:18:40.463 "name": "BaseBdev2", 00:18:40.463 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:40.463 "is_configured": true, 00:18:40.463 "data_offset": 2048, 00:18:40.463 "data_size": 63488 00:18:40.463 } 00:18:40.463 ] 00:18:40.463 }' 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:40.463 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:40.463 [2024-07-24 18:55:25.444030] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:40.721 [2024-07-24 18:55:25.543135] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:40.721 [2024-07-24 18:55:25.543168] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.721 [2024-07-24 18:55:25.543177] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:40.721 [2024-07-24 18:55:25.543181] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.721 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.979 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.979 "name": "raid_bdev1", 00:18:40.979 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:40.979 "strip_size_kb": 0, 00:18:40.979 "state": "online", 00:18:40.979 "raid_level": "raid1", 00:18:40.979 "superblock": true, 00:18:40.979 "num_base_bdevs": 2, 00:18:40.979 "num_base_bdevs_discovered": 1, 00:18:40.979 "num_base_bdevs_operational": 1, 00:18:40.979 "base_bdevs_list": [ 00:18:40.979 { 00:18:40.979 "name": null, 00:18:40.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.979 "is_configured": false, 00:18:40.979 "data_offset": 2048, 00:18:40.979 "data_size": 63488 00:18:40.979 }, 00:18:40.979 { 00:18:40.979 "name": "BaseBdev2", 00:18:40.979 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:40.979 "is_configured": true, 00:18:40.979 "data_offset": 2048, 00:18:40.979 "data_size": 63488 00:18:40.979 } 00:18:40.979 ] 00:18:40.979 }' 00:18:40.979 18:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.979 18:55:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.236 18:55:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:41.494 [2024-07-24 18:55:26.373255] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:41.494 [2024-07-24 
18:55:26.373294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:41.494 [2024-07-24 18:55:26.373307] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17453e0 00:18:41.494 [2024-07-24 18:55:26.373313] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:41.494 [2024-07-24 18:55:26.373622] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:41.494 [2024-07-24 18:55:26.373634] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:41.494 [2024-07-24 18:55:26.373703] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:41.494 [2024-07-24 18:55:26.373710] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:41.494 [2024-07-24 18:55:26.373715] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:41.494 [2024-07-24 18:55:26.373725] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:41.494 [2024-07-24 18:55:26.377921] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbcf0 00:18:41.494 spare 00:18:41.494 [2024-07-24 18:55:26.378945] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:41.494 18:55:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.428 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:42.686 "name": "raid_bdev1", 00:18:42.686 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:42.686 "strip_size_kb": 0, 00:18:42.686 "state": "online", 00:18:42.686 "raid_level": "raid1", 00:18:42.686 "superblock": true, 00:18:42.686 "num_base_bdevs": 2, 00:18:42.686 "num_base_bdevs_discovered": 2, 00:18:42.686 "num_base_bdevs_operational": 2, 00:18:42.686 "process": { 00:18:42.686 "type": "rebuild", 00:18:42.686 "target": "spare", 00:18:42.686 "progress": { 00:18:42.686 "blocks": 22528, 00:18:42.686 "percent": 35 00:18:42.686 } 00:18:42.686 }, 00:18:42.686 "base_bdevs_list": [ 00:18:42.686 { 00:18:42.686 "name": "spare", 00:18:42.686 "uuid": "016dd32e-8ba2-594d-bc18-82f2a6d46a20", 00:18:42.686 "is_configured": true, 00:18:42.686 "data_offset": 2048, 00:18:42.686 "data_size": 63488 00:18:42.686 }, 00:18:42.686 { 00:18:42.686 "name": "BaseBdev2", 00:18:42.686 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:42.686 "is_configured": true, 00:18:42.686 "data_offset": 2048, 00:18:42.686 "data_size": 63488 00:18:42.686 } 
00:18:42.686 ] 00:18:42.686 }' 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:42.686 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:42.944 [2024-07-24 18:55:27.785957] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:42.945 [2024-07-24 18:55:27.788803] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:42.945 [2024-07-24 18:55:27.788828] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:42.945 [2024-07-24 18:55:27.788837] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:42.945 [2024-07-24 18:55:27.788840] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.945 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.203 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.203 "name": "raid_bdev1", 00:18:43.203 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:43.203 "strip_size_kb": 0, 00:18:43.203 "state": "online", 00:18:43.203 "raid_level": "raid1", 00:18:43.203 "superblock": true, 00:18:43.203 "num_base_bdevs": 2, 00:18:43.203 "num_base_bdevs_discovered": 1, 00:18:43.203 "num_base_bdevs_operational": 1, 00:18:43.203 "base_bdevs_list": [ 00:18:43.203 { 00:18:43.203 "name": null, 00:18:43.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.203 "is_configured": false, 00:18:43.203 "data_offset": 2048, 00:18:43.203 "data_size": 63488 00:18:43.203 }, 00:18:43.203 { 00:18:43.203 "name": "BaseBdev2", 00:18:43.203 "uuid": 
"ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:43.203 "is_configured": true, 00:18:43.203 "data_offset": 2048, 00:18:43.203 "data_size": 63488 00:18:43.203 } 00:18:43.203 ] 00:18:43.203 }' 00:18:43.203 18:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.203 18:55:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.461 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:43.720 "name": "raid_bdev1", 00:18:43.720 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:43.720 "strip_size_kb": 0, 00:18:43.720 "state": "online", 00:18:43.720 "raid_level": "raid1", 00:18:43.720 "superblock": true, 00:18:43.720 "num_base_bdevs": 2, 00:18:43.720 "num_base_bdevs_discovered": 1, 00:18:43.720 "num_base_bdevs_operational": 1, 00:18:43.720 "base_bdevs_list": [ 00:18:43.720 { 00:18:43.720 "name": null, 00:18:43.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.720 "is_configured": false, 00:18:43.720 "data_offset": 2048, 00:18:43.720 "data_size": 63488 00:18:43.720 }, 00:18:43.720 { 00:18:43.720 "name": "BaseBdev2", 00:18:43.720 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:43.720 "is_configured": true, 00:18:43.720 "data_offset": 2048, 00:18:43.720 "data_size": 63488 00:18:43.720 } 00:18:43.720 ] 00:18:43.720 }' 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:43.720 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:18:43.978 18:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:44.237 [2024-07-24 18:55:29.032109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:44.237 [2024-07-24 18:55:29.032145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.237 [2024-07-24 18:55:29.032155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x174a360 00:18:44.237 [2024-07-24 18:55:29.032161] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.237 [2024-07-24 18:55:29.032433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.237 [2024-07-24 18:55:29.032444] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:44.237 [2024-07-24 18:55:29.032496] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:18:44.237 [2024-07-24 18:55:29.032504] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:44.237 [2024-07-24 18:55:29.032509] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:44.237 BaseBdev1 00:18:44.237 18:55:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.173 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.431 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.431 "name": "raid_bdev1", 00:18:45.431 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:45.431 "strip_size_kb": 0, 00:18:45.431 "state": "online", 00:18:45.431 "raid_level": "raid1", 00:18:45.431 "superblock": true, 00:18:45.431 "num_base_bdevs": 2, 00:18:45.431 "num_base_bdevs_discovered": 1, 00:18:45.431 "num_base_bdevs_operational": 1, 00:18:45.431 "base_bdevs_list": [ 00:18:45.431 { 00:18:45.431 "name": null, 00:18:45.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.431 "is_configured": false, 00:18:45.431 "data_offset": 2048, 00:18:45.431 "data_size": 63488 00:18:45.431 }, 00:18:45.431 { 00:18:45.431 "name": "BaseBdev2", 00:18:45.431 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:45.431 "is_configured": true, 00:18:45.431 "data_offset": 2048, 00:18:45.431 "data_size": 63488 00:18:45.431 } 00:18:45.431 ] 00:18:45.431 }' 00:18:45.431 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.431 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:45.998 "name": "raid_bdev1", 00:18:45.998 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:45.998 "strip_size_kb": 0, 00:18:45.998 "state": "online", 00:18:45.998 "raid_level": "raid1", 00:18:45.998 "superblock": true, 00:18:45.998 "num_base_bdevs": 2, 00:18:45.998 "num_base_bdevs_discovered": 1, 00:18:45.998 "num_base_bdevs_operational": 1, 00:18:45.998 "base_bdevs_list": [ 00:18:45.998 { 00:18:45.998 "name": null, 00:18:45.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.998 "is_configured": false, 00:18:45.998 "data_offset": 2048, 00:18:45.998 "data_size": 63488 00:18:45.998 }, 00:18:45.998 { 00:18:45.998 "name": "BaseBdev2", 00:18:45.998 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:45.998 "is_configured": true, 00:18:45.998 "data_offset": 2048, 00:18:45.998 "data_size": 63488 00:18:45.998 } 00:18:45.998 ] 00:18:45.998 }' 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:45.998 18:55:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:46.256 [2024-07-24 18:55:31.129572] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:46.257 [2024-07-24 18:55:31.129676] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:46.257 [2024-07-24 18:55:31.129685] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:46.257 request: 00:18:46.257 { 00:18:46.257 "base_bdev": "BaseBdev1", 00:18:46.257 "raid_bdev": "raid_bdev1", 00:18:46.257 "method": "bdev_raid_add_base_bdev", 00:18:46.257 "req_id": 1 00:18:46.257 } 00:18:46.257 Got JSON-RPC error response 00:18:46.257 response: 00:18:46.257 { 00:18:46.257 "code": -22, 00:18:46.257 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:18:46.257 } 00:18:46.257 18:55:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:18:46.257 18:55:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:46.257 18:55:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:46.257 18:55:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:46.257 18:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.191 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.449 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.449 "name": "raid_bdev1", 00:18:47.449 
"uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:47.449 "strip_size_kb": 0, 00:18:47.449 "state": "online", 00:18:47.449 "raid_level": "raid1", 00:18:47.449 "superblock": true, 00:18:47.449 "num_base_bdevs": 2, 00:18:47.449 "num_base_bdevs_discovered": 1, 00:18:47.449 "num_base_bdevs_operational": 1, 00:18:47.449 "base_bdevs_list": [ 00:18:47.449 { 00:18:47.449 "name": null, 00:18:47.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.449 "is_configured": false, 00:18:47.449 "data_offset": 2048, 00:18:47.449 "data_size": 63488 00:18:47.449 }, 00:18:47.449 { 00:18:47.449 "name": "BaseBdev2", 00:18:47.449 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:47.449 "is_configured": true, 00:18:47.449 "data_offset": 2048, 00:18:47.449 "data_size": 63488 00:18:47.449 } 00:18:47.449 ] 00:18:47.449 }' 00:18:47.449 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.449 18:55:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:48.015 "name": "raid_bdev1", 00:18:48.015 "uuid": "c4b47855-ff6b-4766-8d14-df4b53a72e38", 00:18:48.015 "strip_size_kb": 0, 00:18:48.015 "state": "online", 00:18:48.015 "raid_level": "raid1", 00:18:48.015 "superblock": true, 00:18:48.015 "num_base_bdevs": 2, 00:18:48.015 "num_base_bdevs_discovered": 1, 00:18:48.015 "num_base_bdevs_operational": 1, 00:18:48.015 "base_bdevs_list": [ 00:18:48.015 { 00:18:48.015 "name": null, 00:18:48.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.015 "is_configured": false, 00:18:48.015 "data_offset": 2048, 00:18:48.015 "data_size": 63488 00:18:48.015 }, 00:18:48.015 { 00:18:48.015 "name": "BaseBdev2", 00:18:48.015 "uuid": "ddbe01ab-6448-5a7d-96e9-f64a2ec9a07f", 00:18:48.015 "is_configured": true, 00:18:48.015 "data_offset": 2048, 00:18:48.015 "data_size": 63488 00:18:48.015 } 00:18:48.015 ] 00:18:48.015 }' 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:48.015 18:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2152740 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2152740 ']' 00:18:48.272 18:55:33 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2152740 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2152740 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2152740' 00:18:48.272 killing process with pid 2152740 00:18:48.272 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2152740 00:18:48.272 Received shutdown signal, test time was about 60.000000 seconds 00:18:48.272 00:18:48.272 Latency(us) 00:18:48.272 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:48.272 =================================================================================================================== 00:18:48.272 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:48.272 [2024-07-24 18:55:33.065573] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:48.272 [2024-07-24 18:55:33.065648] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:48.272 [2024-07-24 18:55:33.065680] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:48.273 [2024-07-24 18:55:33.065687] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1742770 name raid_bdev1, state offline 00:18:48.273 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2152740 00:18:48.273 [2024-07-24 18:55:33.089523] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:48.273 18:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:18:48.273 00:18:48.273 real 0m28.142s 00:18:48.273 user 0m41.021s 00:18:48.273 sys 0m3.934s 00:18:48.273 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:48.273 18:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.273 ************************************ 00:18:48.273 END TEST raid_rebuild_test_sb 00:18:48.273 ************************************ 00:18:48.540 18:55:33 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:18:48.540 18:55:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:48.540 18:55:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:48.540 18:55:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:48.540 ************************************ 00:18:48.540 START TEST raid_rebuild_test_io 00:18:48.540 ************************************ 00:18:48.540 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # 
local superblock=false 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2157850 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2157850 /var/tmp/spdk-raid.sock 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2157850 ']' 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:48.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:48.541 18:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:48.541 [2024-07-24 18:55:33.393792] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:18:48.541 [2024-07-24 18:55:33.393829] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2157850 ] 00:18:48.541 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:48.541 Zero copy mechanism will not be used. 00:18:48.541 [2024-07-24 18:55:33.456671] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:48.541 [2024-07-24 18:55:33.533831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:48.825 [2024-07-24 18:55:33.584150] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:48.825 [2024-07-24 18:55:33.584177] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:49.396 18:55:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:49.396 18:55:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:18:49.396 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:49.396 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:49.396 BaseBdev1_malloc 00:18:49.396 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:49.654 [2024-07-24 18:55:34.491275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:49.654 [2024-07-24 18:55:34.491310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.654 [2024-07-24 18:55:34.491323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x906130 00:18:49.654 [2024-07-24 18:55:34.491329] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.654 [2024-07-24 18:55:34.492390] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.655 [2024-07-24 18:55:34.492411] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:49.655 BaseBdev1 00:18:49.655 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:49.655 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:49.913 BaseBdev2_malloc 00:18:49.913 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:49.913 [2024-07-24 18:55:34.835719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:49.913 [2024-07-24 18:55:34.835749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.913 [2024-07-24 18:55:34.835760] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaabfa0 00:18:49.913 [2024-07-24 18:55:34.835786] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.913 [2024-07-24 18:55:34.836786] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.913 [2024-07-24 18:55:34.836805] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:49.913 BaseBdev2 00:18:49.913 18:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:50.171 spare_malloc 00:18:50.171 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:50.171 spare_delay 00:18:50.171 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:50.429 [2024-07-24 18:55:35.308272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:50.429 [2024-07-24 18:55:35.308304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.429 [2024-07-24 18:55:35.308315] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaadf40 00:18:50.429 [2024-07-24 18:55:35.308321] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.429 [2024-07-24 18:55:35.309329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.429 [2024-07-24 18:55:35.309349] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:50.429 spare 00:18:50.429 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:50.689 [2024-07-24 18:55:35.476727] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:50.689 [2024-07-24 18:55:35.477599] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:50.689 [2024-07-24 18:55:35.477655] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xaaf370 00:18:50.689 [2024-07-24 18:55:35.477660] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:50.689 [2024-07-24 18:55:35.477796] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaae1d0 00:18:50.689 [2024-07-24 18:55:35.477893] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaaf370 00:18:50.689 [2024-07-24 18:55:35.477899] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaaf370 00:18:50.689 [2024-07-24 18:55:35.477972] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.689 "name": "raid_bdev1", 00:18:50.689 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:50.689 "strip_size_kb": 0, 00:18:50.689 "state": "online", 00:18:50.689 "raid_level": "raid1", 00:18:50.689 "superblock": false, 00:18:50.689 "num_base_bdevs": 2, 00:18:50.689 "num_base_bdevs_discovered": 2, 00:18:50.689 "num_base_bdevs_operational": 2, 00:18:50.689 "base_bdevs_list": [ 00:18:50.689 { 00:18:50.689 "name": "BaseBdev1", 00:18:50.689 "uuid": "aa33079f-f274-5eb0-adff-15fb7a2cac66", 00:18:50.689 "is_configured": true, 00:18:50.689 "data_offset": 0, 00:18:50.689 "data_size": 65536 00:18:50.689 }, 00:18:50.689 { 00:18:50.689 "name": "BaseBdev2", 00:18:50.689 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:50.689 "is_configured": true, 00:18:50.689 "data_offset": 0, 00:18:50.689 "data_size": 65536 00:18:50.689 } 00:18:50.689 ] 00:18:50.689 }' 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.689 18:55:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:51.255 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:51.255 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:51.512 [2024-07-24 18:55:36.311004] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:51.512 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:18:51.512 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.512 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:51.513 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:18:51.513 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:18:51.513 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:51.513 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:51.771 [2024-07-24 18:55:36.589383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x8fd920 00:18:51.771 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:51.771 Zero copy mechanism will not be used. 00:18:51.771 Running I/O for 60 seconds... 00:18:51.771 [2024-07-24 18:55:36.669621] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:51.771 [2024-07-24 18:55:36.669783] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x8fd920 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.771 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.029 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.029 "name": "raid_bdev1", 00:18:52.029 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:52.029 "strip_size_kb": 0, 00:18:52.029 "state": "online", 00:18:52.029 "raid_level": "raid1", 00:18:52.029 "superblock": false, 00:18:52.029 "num_base_bdevs": 2, 00:18:52.029 "num_base_bdevs_discovered": 1, 00:18:52.029 "num_base_bdevs_operational": 1, 00:18:52.029 "base_bdevs_list": [ 00:18:52.029 { 00:18:52.029 "name": null, 00:18:52.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.029 "is_configured": false, 00:18:52.029 "data_offset": 0, 00:18:52.029 "data_size": 65536 00:18:52.029 }, 00:18:52.029 { 00:18:52.029 "name": "BaseBdev2", 00:18:52.029 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:52.029 "is_configured": true, 00:18:52.029 "data_offset": 0, 00:18:52.029 "data_size": 65536 00:18:52.029 } 00:18:52.029 ] 00:18:52.029 }' 00:18:52.029 18:55:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.029 18:55:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:52.596 18:55:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:52.596 [2024-07-24 18:55:37.541595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:52.596 18:55:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:52.596 [2024-07-24 18:55:37.571117] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x905a60 
00:18:52.596 [2024-07-24 18:55:37.572661] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:52.855 [2024-07-24 18:55:37.674804] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:52.855 [2024-07-24 18:55:37.675152] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:53.113 [2024-07-24 18:55:37.893703] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:53.113 [2024-07-24 18:55:37.893805] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:53.373 [2024-07-24 18:55:38.326452] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:53.373 [2024-07-24 18:55:38.326574] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.632 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.891 [2024-07-24 18:55:38.658800] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:53.891 [2024-07-24 18:55:38.658966] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:53.891 "name": "raid_bdev1", 00:18:53.891 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:53.891 "strip_size_kb": 0, 00:18:53.891 "state": "online", 00:18:53.891 "raid_level": "raid1", 00:18:53.891 "superblock": false, 00:18:53.891 "num_base_bdevs": 2, 00:18:53.891 "num_base_bdevs_discovered": 2, 00:18:53.891 "num_base_bdevs_operational": 2, 00:18:53.891 "process": { 00:18:53.891 "type": "rebuild", 00:18:53.891 "target": "spare", 00:18:53.891 "progress": { 00:18:53.891 "blocks": 16384, 00:18:53.891 "percent": 25 00:18:53.891 } 00:18:53.891 }, 00:18:53.891 "base_bdevs_list": [ 00:18:53.891 { 00:18:53.891 "name": "spare", 00:18:53.891 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:18:53.891 "is_configured": true, 00:18:53.891 "data_offset": 0, 00:18:53.891 "data_size": 65536 00:18:53.891 }, 00:18:53.891 { 00:18:53.891 "name": "BaseBdev2", 00:18:53.891 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:53.891 "is_configured": true, 00:18:53.891 "data_offset": 0, 00:18:53.891 "data_size": 65536 00:18:53.891 } 00:18:53.891 ] 00:18:53.891 }' 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:53.891 18:55:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:54.149 [2024-07-24 18:55:38.969355] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:54.149 [2024-07-24 18:55:38.985576] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:18:54.149 [2024-07-24 18:55:38.985922] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:18:54.149 [2024-07-24 18:55:39.086884] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:54.149 [2024-07-24 18:55:39.088464] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:54.149 [2024-07-24 18:55:39.088488] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:54.149 [2024-07-24 18:55:39.088494] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:54.149 [2024-07-24 18:55:39.099113] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x8fd920 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.149 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.408 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.408 "name": "raid_bdev1", 00:18:54.408 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:54.408 "strip_size_kb": 0, 00:18:54.408 "state": "online", 00:18:54.408 "raid_level": "raid1", 00:18:54.408 "superblock": false, 00:18:54.408 "num_base_bdevs": 2, 00:18:54.408 "num_base_bdevs_discovered": 1, 00:18:54.408 
"num_base_bdevs_operational": 1, 00:18:54.408 "base_bdevs_list": [ 00:18:54.408 { 00:18:54.408 "name": null, 00:18:54.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.408 "is_configured": false, 00:18:54.408 "data_offset": 0, 00:18:54.408 "data_size": 65536 00:18:54.408 }, 00:18:54.408 { 00:18:54.408 "name": "BaseBdev2", 00:18:54.408 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:54.408 "is_configured": true, 00:18:54.408 "data_offset": 0, 00:18:54.408 "data_size": 65536 00:18:54.408 } 00:18:54.408 ] 00:18:54.408 }' 00:18:54.408 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.408 18:55:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.975 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.234 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:55.234 "name": "raid_bdev1", 00:18:55.234 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:55.234 "strip_size_kb": 0, 00:18:55.234 "state": "online", 00:18:55.234 "raid_level": "raid1", 00:18:55.234 "superblock": false, 00:18:55.234 "num_base_bdevs": 2, 00:18:55.234 "num_base_bdevs_discovered": 1, 00:18:55.234 "num_base_bdevs_operational": 1, 00:18:55.234 "base_bdevs_list": [ 00:18:55.234 { 00:18:55.234 "name": null, 00:18:55.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.234 "is_configured": false, 00:18:55.234 "data_offset": 0, 00:18:55.234 "data_size": 65536 00:18:55.234 }, 00:18:55.234 { 00:18:55.234 "name": "BaseBdev2", 00:18:55.234 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:55.234 "is_configured": true, 00:18:55.234 "data_offset": 0, 00:18:55.234 "data_size": 65536 00:18:55.234 } 00:18:55.234 ] 00:18:55.234 }' 00:18:55.234 18:55:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:55.234 18:55:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:55.234 18:55:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:55.234 18:55:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:55.234 18:55:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:55.234 [2024-07-24 18:55:40.232228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:55.493 [2024-07-24 18:55:40.262199] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9056f0 00:18:55.493 [2024-07-24 18:55:40.263246] 
bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:55.493 18:55:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:55.493 [2024-07-24 18:55:40.381509] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:55.493 [2024-07-24 18:55:40.381857] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:55.751 [2024-07-24 18:55:40.589546] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:55.751 [2024-07-24 18:55:40.589647] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:56.019 [2024-07-24 18:55:40.919420] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:56.019 [2024-07-24 18:55:40.919666] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:56.283 [2024-07-24 18:55:41.128168] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:56.283 [2024-07-24 18:55:41.128321] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.283 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.542 [2024-07-24 18:55:41.350616] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:56.542 "name": "raid_bdev1", 00:18:56.542 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:56.542 "strip_size_kb": 0, 00:18:56.542 "state": "online", 00:18:56.542 "raid_level": "raid1", 00:18:56.542 "superblock": false, 00:18:56.542 "num_base_bdevs": 2, 00:18:56.542 "num_base_bdevs_discovered": 2, 00:18:56.542 "num_base_bdevs_operational": 2, 00:18:56.542 "process": { 00:18:56.542 "type": "rebuild", 00:18:56.542 "target": "spare", 00:18:56.542 "progress": { 00:18:56.542 "blocks": 14336, 00:18:56.542 "percent": 21 00:18:56.542 } 00:18:56.542 }, 00:18:56.542 "base_bdevs_list": [ 00:18:56.542 { 00:18:56.542 "name": "spare", 00:18:56.542 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:18:56.542 "is_configured": true, 00:18:56.542 "data_offset": 0, 00:18:56.542 "data_size": 65536 00:18:56.542 }, 00:18:56.542 { 00:18:56.542 "name": "BaseBdev2", 00:18:56.542 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 
00:18:56.542 "is_configured": true, 00:18:56.542 "data_offset": 0, 00:18:56.542 "data_size": 65536 00:18:56.542 } 00:18:56.542 ] 00:18:56.542 }' 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:56.542 [2024-07-24 18:55:41.468404] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=622 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.542 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:56.802 "name": "raid_bdev1", 00:18:56.802 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:56.802 "strip_size_kb": 0, 00:18:56.802 "state": "online", 00:18:56.802 "raid_level": "raid1", 00:18:56.802 "superblock": false, 00:18:56.802 "num_base_bdevs": 2, 00:18:56.802 "num_base_bdevs_discovered": 2, 00:18:56.802 "num_base_bdevs_operational": 2, 00:18:56.802 "process": { 00:18:56.802 "type": "rebuild", 00:18:56.802 "target": "spare", 00:18:56.802 "progress": { 00:18:56.802 "blocks": 18432, 00:18:56.802 "percent": 28 00:18:56.802 } 00:18:56.802 }, 00:18:56.802 "base_bdevs_list": [ 00:18:56.802 { 00:18:56.802 "name": "spare", 00:18:56.802 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:18:56.802 "is_configured": true, 00:18:56.802 "data_offset": 0, 00:18:56.802 "data_size": 65536 00:18:56.802 }, 00:18:56.802 { 00:18:56.802 "name": "BaseBdev2", 00:18:56.802 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:56.802 "is_configured": true, 00:18:56.802 "data_offset": 0, 00:18:56.802 "data_size": 65536 00:18:56.802 } 00:18:56.802 ] 00:18:56.802 }' 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:18:56.802 [2024-07-24 18:55:41.695757] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:56.802 18:55:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:57.061 [2024-07-24 18:55:41.920137] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:18:57.319 [2024-07-24 18:55:42.150229] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:18:57.577 [2024-07-24 18:55:42.362526] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:18:57.577 [2024-07-24 18:55:42.362648] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.842 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:58.103 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:58.103 "name": "raid_bdev1", 00:18:58.103 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:58.103 "strip_size_kb": 0, 00:18:58.103 "state": "online", 00:18:58.103 "raid_level": "raid1", 00:18:58.103 "superblock": false, 00:18:58.103 "num_base_bdevs": 2, 00:18:58.103 "num_base_bdevs_discovered": 2, 00:18:58.103 "num_base_bdevs_operational": 2, 00:18:58.103 "process": { 00:18:58.103 "type": "rebuild", 00:18:58.103 "target": "spare", 00:18:58.103 "progress": { 00:18:58.103 "blocks": 36864, 00:18:58.103 "percent": 56 00:18:58.103 } 00:18:58.103 }, 00:18:58.103 "base_bdevs_list": [ 00:18:58.103 { 00:18:58.103 "name": "spare", 00:18:58.103 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:18:58.103 "is_configured": true, 00:18:58.103 "data_offset": 0, 00:18:58.103 "data_size": 65536 00:18:58.103 }, 00:18:58.103 { 00:18:58.103 "name": "BaseBdev2", 00:18:58.103 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:58.103 "is_configured": true, 00:18:58.103 "data_offset": 0, 00:18:58.103 "data_size": 65536 00:18:58.103 } 00:18:58.103 ] 00:18:58.103 }' 00:18:58.103 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // 
"none"' 00:18:58.103 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:58.103 18:55:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:58.103 18:55:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:58.103 18:55:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:58.361 [2024-07-24 18:55:43.332541] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:18:58.361 [2024-07-24 18:55:43.332805] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:18:58.619 [2024-07-24 18:55:43.539626] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.186 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:59.444 "name": "raid_bdev1", 00:18:59.444 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:18:59.444 "strip_size_kb": 0, 00:18:59.444 "state": "online", 00:18:59.444 "raid_level": "raid1", 00:18:59.444 "superblock": false, 00:18:59.444 "num_base_bdevs": 2, 00:18:59.444 "num_base_bdevs_discovered": 2, 00:18:59.444 "num_base_bdevs_operational": 2, 00:18:59.444 "process": { 00:18:59.444 "type": "rebuild", 00:18:59.444 "target": "spare", 00:18:59.444 "progress": { 00:18:59.444 "blocks": 57344, 00:18:59.444 "percent": 87 00:18:59.444 } 00:18:59.444 }, 00:18:59.444 "base_bdevs_list": [ 00:18:59.444 { 00:18:59.444 "name": "spare", 00:18:59.444 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:18:59.444 "is_configured": true, 00:18:59.444 "data_offset": 0, 00:18:59.444 "data_size": 65536 00:18:59.444 }, 00:18:59.444 { 00:18:59.444 "name": "BaseBdev2", 00:18:59.444 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:18:59.444 "is_configured": true, 00:18:59.444 "data_offset": 0, 00:18:59.444 "data_size": 65536 00:18:59.444 } 00:18:59.444 ] 00:18:59.444 }' 00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 
00:18:59.444 18:55:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:59.702 [2024-07-24 18:55:44.618735] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:59.961 [2024-07-24 18:55:44.718999] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:59.961 [2024-07-24 18:55:44.720069] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.528 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:00.528 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:00.529 "name": "raid_bdev1", 00:19:00.529 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:19:00.529 "strip_size_kb": 0, 00:19:00.529 "state": "online", 00:19:00.529 "raid_level": "raid1", 00:19:00.529 "superblock": false, 00:19:00.529 "num_base_bdevs": 2, 00:19:00.529 "num_base_bdevs_discovered": 2, 00:19:00.529 "num_base_bdevs_operational": 2, 00:19:00.529 "base_bdevs_list": [ 00:19:00.529 { 00:19:00.529 "name": "spare", 00:19:00.529 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:19:00.529 "is_configured": true, 00:19:00.529 "data_offset": 0, 00:19:00.529 "data_size": 65536 00:19:00.529 }, 00:19:00.529 { 00:19:00.529 "name": "BaseBdev2", 00:19:00.529 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:19:00.529 "is_configured": true, 00:19:00.529 "data_offset": 0, 00:19:00.529 "data_size": 65536 00:19:00.529 } 00:19:00.529 ] 00:19:00.529 }' 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:00.529 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:00.787 18:55:45 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:00.787 "name": "raid_bdev1", 00:19:00.787 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:19:00.787 "strip_size_kb": 0, 00:19:00.787 "state": "online", 00:19:00.787 "raid_level": "raid1", 00:19:00.787 "superblock": false, 00:19:00.787 "num_base_bdevs": 2, 00:19:00.787 "num_base_bdevs_discovered": 2, 00:19:00.787 "num_base_bdevs_operational": 2, 00:19:00.787 "base_bdevs_list": [ 00:19:00.787 { 00:19:00.787 "name": "spare", 00:19:00.787 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:19:00.787 "is_configured": true, 00:19:00.787 "data_offset": 0, 00:19:00.787 "data_size": 65536 00:19:00.787 }, 00:19:00.787 { 00:19:00.787 "name": "BaseBdev2", 00:19:00.787 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:19:00.787 "is_configured": true, 00:19:00.787 "data_offset": 0, 00:19:00.787 "data_size": 65536 00:19:00.787 } 00:19:00.787 ] 00:19:00.787 }' 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.787 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.045 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.045 "name": "raid_bdev1", 00:19:01.045 "uuid": "fa16f559-131e-4ae7-a240-804f47df2637", 00:19:01.045 "strip_size_kb": 0, 00:19:01.045 "state": "online", 00:19:01.045 "raid_level": "raid1", 00:19:01.045 "superblock": false, 00:19:01.045 "num_base_bdevs": 2, 00:19:01.045 "num_base_bdevs_discovered": 2, 
00:19:01.045 "num_base_bdevs_operational": 2, 00:19:01.045 "base_bdevs_list": [ 00:19:01.045 { 00:19:01.045 "name": "spare", 00:19:01.045 "uuid": "2827c86a-20ca-57eb-ab94-38d662fac6c4", 00:19:01.045 "is_configured": true, 00:19:01.045 "data_offset": 0, 00:19:01.045 "data_size": 65536 00:19:01.045 }, 00:19:01.045 { 00:19:01.045 "name": "BaseBdev2", 00:19:01.045 "uuid": "eb56261e-0bf6-53fe-96b2-4b6e2d2e7278", 00:19:01.045 "is_configured": true, 00:19:01.045 "data_offset": 0, 00:19:01.045 "data_size": 65536 00:19:01.045 } 00:19:01.045 ] 00:19:01.045 }' 00:19:01.045 18:55:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.045 18:55:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:01.613 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:01.613 [2024-07-24 18:55:46.585275] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:01.613 [2024-07-24 18:55:46.585300] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:01.872 00:19:01.872 Latency(us) 00:19:01.872 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:01.872 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:01.872 raid_bdev1 : 10.03 112.21 336.62 0.00 0.00 11897.17 245.76 107354.21 00:19:01.872 =================================================================================================================== 00:19:01.872 Total : 112.21 336.62 0.00 0.00 11897.17 245.76 107354.21 00:19:01.872 [2024-07-24 18:55:46.644107] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:01.873 [2024-07-24 18:55:46.644126] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:01.873 [2024-07-24 18:55:46.644170] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:01.873 [2024-07-24 18:55:46.644176] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaaf370 name raid_bdev1, state offline 00:19:01.873 0 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:01.873 18:55:46 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:01.873 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:02.131 /dev/nbd0 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:02.131 18:55:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:02.131 1+0 records in 00:19:02.131 1+0 records out 00:19:02.131 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021492 s, 19.1 MB/s 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:02.131 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@11 -- # local nbd_list 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:02.132 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:02.390 /dev/nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:02.390 1+0 records in 00:19:02.390 1+0 records out 00:19:02.390 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196344 s, 20.9 MB/s 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:02.390 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:02.390 
18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2157850 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2157850 ']' 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2157850 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:02.649 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2157850 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:02.907 18:55:47 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2157850' 00:19:02.907 killing process with pid 2157850 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2157850 00:19:02.907 Received shutdown signal, test time was about 11.067245 seconds 00:19:02.907 00:19:02.907 Latency(us) 00:19:02.907 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:02.907 =================================================================================================================== 00:19:02.907 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:02.907 [2024-07-24 18:55:47.685141] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2157850 00:19:02.907 [2024-07-24 18:55:47.703560] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:02.907 00:19:02.907 real 0m14.544s 00:19:02.907 user 0m21.755s 00:19:02.907 sys 0m1.778s 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:02.907 18:55:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:02.907 ************************************ 00:19:02.907 END TEST raid_rebuild_test_io 00:19:02.907 ************************************ 00:19:02.907 18:55:47 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:19:02.907 18:55:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:02.907 18:55:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:02.907 18:55:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:03.166 ************************************ 00:19:03.166 START TEST raid_rebuild_test_sb_io 00:19:03.166 ************************************ 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:03.166 18:55:47 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2160440 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2160440 /var/tmp/spdk-raid.sock 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2160440 ']' 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:03.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:03.166 18:55:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:03.166 [2024-07-24 18:55:48.005848] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:19:03.166 [2024-07-24 18:55:48.005891] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2160440 ] 00:19:03.166 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:03.166 Zero copy mechanism will not be used. 
00:19:03.166 [2024-07-24 18:55:48.070668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:03.166 [2024-07-24 18:55:48.145648] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:03.424 [2024-07-24 18:55:48.203862] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:03.424 [2024-07-24 18:55:48.203888] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:03.991 18:55:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:03.991 18:55:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:19:03.991 18:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:03.991 18:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:03.991 BaseBdev1_malloc 00:19:03.991 18:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:04.250 [2024-07-24 18:55:49.119151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:04.250 [2024-07-24 18:55:49.119185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.250 [2024-07-24 18:55:49.119196] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cca130 00:19:04.250 [2024-07-24 18:55:49.119202] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.250 [2024-07-24 18:55:49.120245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.250 [2024-07-24 18:55:49.120264] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:04.250 BaseBdev1 00:19:04.250 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:04.250 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:04.509 BaseBdev2_malloc 00:19:04.509 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:04.509 [2024-07-24 18:55:49.471504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:04.509 [2024-07-24 18:55:49.471536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.509 [2024-07-24 18:55:49.471548] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e6ffa0 00:19:04.509 [2024-07-24 18:55:49.471554] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.509 [2024-07-24 18:55:49.472578] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.509 [2024-07-24 18:55:49.472599] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:04.509 BaseBdev2 00:19:04.509 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:19:04.768 spare_malloc 00:19:04.768 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:05.067 spare_delay 00:19:05.067 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:05.067 [2024-07-24 18:55:49.976113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:05.067 [2024-07-24 18:55:49.976138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:05.067 [2024-07-24 18:55:49.976147] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e71f40 00:19:05.067 [2024-07-24 18:55:49.976153] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:05.067 [2024-07-24 18:55:49.977114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:05.067 [2024-07-24 18:55:49.977134] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:05.067 spare 00:19:05.067 18:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:05.326 [2024-07-24 18:55:50.140576] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:05.326 [2024-07-24 18:55:50.141454] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:05.326 [2024-07-24 18:55:50.141578] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e73370 00:19:05.326 [2024-07-24 18:55:50.141587] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:05.326 [2024-07-24 18:55:50.141719] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e721d0 00:19:05.326 [2024-07-24 18:55:50.141814] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e73370 00:19:05.326 [2024-07-24 18:55:50.141819] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e73370 00:19:05.326 [2024-07-24 18:55:50.141882] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.326 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.327 "name": "raid_bdev1", 00:19:05.327 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:05.327 "strip_size_kb": 0, 00:19:05.327 "state": "online", 00:19:05.327 "raid_level": "raid1", 00:19:05.327 "superblock": true, 00:19:05.327 "num_base_bdevs": 2, 00:19:05.327 "num_base_bdevs_discovered": 2, 00:19:05.327 "num_base_bdevs_operational": 2, 00:19:05.327 "base_bdevs_list": [ 00:19:05.327 { 00:19:05.327 "name": "BaseBdev1", 00:19:05.327 "uuid": "8ed8a451-dc7f-5ca0-be82-e554dc202e19", 00:19:05.327 "is_configured": true, 00:19:05.327 "data_offset": 2048, 00:19:05.327 "data_size": 63488 00:19:05.327 }, 00:19:05.327 { 00:19:05.327 "name": "BaseBdev2", 00:19:05.327 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:05.327 "is_configured": true, 00:19:05.327 "data_offset": 2048, 00:19:05.327 "data_size": 63488 00:19:05.327 } 00:19:05.327 ] 00:19:05.327 }' 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.327 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:05.894 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:05.894 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:06.153 [2024-07-24 18:55:50.978880] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:06.153 18:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:06.153 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.153 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:06.411 [2024-07-24 18:55:51.257260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc14d0 00:19:06.411 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:06.411 Zero copy mechanism will not be used. 00:19:06.411 Running I/O for 60 seconds... 
00:19:06.411 [2024-07-24 18:55:51.324795] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:06.411 [2024-07-24 18:55:51.329888] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cc14d0 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.411 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.669 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.669 "name": "raid_bdev1", 00:19:06.669 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:06.669 "strip_size_kb": 0, 00:19:06.669 "state": "online", 00:19:06.669 "raid_level": "raid1", 00:19:06.669 "superblock": true, 00:19:06.669 "num_base_bdevs": 2, 00:19:06.669 "num_base_bdevs_discovered": 1, 00:19:06.669 "num_base_bdevs_operational": 1, 00:19:06.669 "base_bdevs_list": [ 00:19:06.669 { 00:19:06.669 "name": null, 00:19:06.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.669 "is_configured": false, 00:19:06.669 "data_offset": 2048, 00:19:06.669 "data_size": 63488 00:19:06.669 }, 00:19:06.669 { 00:19:06.669 "name": "BaseBdev2", 00:19:06.669 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:06.669 "is_configured": true, 00:19:06.669 "data_offset": 2048, 00:19:06.669 "data_size": 63488 00:19:06.669 } 00:19:06.669 ] 00:19:06.669 }' 00:19:06.669 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.669 18:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:07.237 18:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:07.237 [2024-07-24 18:55:52.182597] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:07.237 [2024-07-24 18:55:52.216679] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc1760 00:19:07.237 [2024-07-24 18:55:52.218323] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:07.237 18:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 
-- # sleep 1 00:19:07.495 [2024-07-24 18:55:52.330909] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:07.495 [2024-07-24 18:55:52.331216] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:07.495 [2024-07-24 18:55:52.438649] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:07.495 [2024-07-24 18:55:52.438775] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:07.754 [2024-07-24 18:55:52.658466] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:08.012 [2024-07-24 18:55:52.871772] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:08.012 [2024-07-24 18:55:52.871882] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.269 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.526 [2024-07-24 18:55:53.326081] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:08.526 [2024-07-24 18:55:53.326227] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:08.526 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:08.526 "name": "raid_bdev1", 00:19:08.526 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:08.526 "strip_size_kb": 0, 00:19:08.526 "state": "online", 00:19:08.526 "raid_level": "raid1", 00:19:08.526 "superblock": true, 00:19:08.526 "num_base_bdevs": 2, 00:19:08.526 "num_base_bdevs_discovered": 2, 00:19:08.526 "num_base_bdevs_operational": 2, 00:19:08.526 "process": { 00:19:08.526 "type": "rebuild", 00:19:08.526 "target": "spare", 00:19:08.526 "progress": { 00:19:08.526 "blocks": 16384, 00:19:08.526 "percent": 25 00:19:08.526 } 00:19:08.526 }, 00:19:08.526 "base_bdevs_list": [ 00:19:08.526 { 00:19:08.526 "name": "spare", 00:19:08.526 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:08.526 "is_configured": true, 00:19:08.526 "data_offset": 2048, 00:19:08.526 "data_size": 63488 00:19:08.526 }, 00:19:08.526 { 00:19:08.526 "name": "BaseBdev2", 00:19:08.526 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:08.526 "is_configured": true, 00:19:08.526 "data_offset": 2048, 00:19:08.526 "data_size": 63488 00:19:08.526 } 00:19:08.526 ] 00:19:08.526 }' 
00:19:08.527 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:08.527 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:08.527 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:08.527 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:08.527 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:08.784 [2024-07-24 18:55:53.647969] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.784 [2024-07-24 18:55:53.654094] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:19:08.784 [2024-07-24 18:55:53.760296] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:08.784 [2024-07-24 18:55:53.772177] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.784 [2024-07-24 18:55:53.772196] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.784 [2024-07-24 18:55:53.772201] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:08.784 [2024-07-24 18:55:53.782694] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cc14d0 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.043 "name": "raid_bdev1", 00:19:09.043 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:09.043 "strip_size_kb": 0, 00:19:09.043 "state": "online", 00:19:09.043 "raid_level": "raid1", 00:19:09.043 "superblock": true, 00:19:09.043 "num_base_bdevs": 2, 00:19:09.043 "num_base_bdevs_discovered": 1, 00:19:09.043 "num_base_bdevs_operational": 1, 00:19:09.043 "base_bdevs_list": 
[ 00:19:09.043 { 00:19:09.043 "name": null, 00:19:09.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.043 "is_configured": false, 00:19:09.043 "data_offset": 2048, 00:19:09.043 "data_size": 63488 00:19:09.043 }, 00:19:09.043 { 00:19:09.043 "name": "BaseBdev2", 00:19:09.043 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:09.043 "is_configured": true, 00:19:09.043 "data_offset": 2048, 00:19:09.043 "data_size": 63488 00:19:09.043 } 00:19:09.043 ] 00:19:09.043 }' 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.043 18:55:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.610 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:09.869 "name": "raid_bdev1", 00:19:09.869 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:09.869 "strip_size_kb": 0, 00:19:09.869 "state": "online", 00:19:09.869 "raid_level": "raid1", 00:19:09.869 "superblock": true, 00:19:09.869 "num_base_bdevs": 2, 00:19:09.869 "num_base_bdevs_discovered": 1, 00:19:09.869 "num_base_bdevs_operational": 1, 00:19:09.869 "base_bdevs_list": [ 00:19:09.869 { 00:19:09.869 "name": null, 00:19:09.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.869 "is_configured": false, 00:19:09.869 "data_offset": 2048, 00:19:09.869 "data_size": 63488 00:19:09.869 }, 00:19:09.869 { 00:19:09.869 "name": "BaseBdev2", 00:19:09.869 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:09.869 "is_configured": true, 00:19:09.869 "data_offset": 2048, 00:19:09.869 "data_size": 63488 00:19:09.869 } 00:19:09.869 ] 00:19:09.869 }' 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:09.869 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:10.127 [2024-07-24 18:55:54.912451] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:10.127 [2024-07-24 18:55:54.952116] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf0b70 00:19:10.127 18:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- 
# sleep 1 00:19:10.127 [2024-07-24 18:55:54.953212] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:10.127 [2024-07-24 18:55:55.071578] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:10.127 [2024-07-24 18:55:55.071807] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:10.386 [2024-07-24 18:55:55.279010] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:10.386 [2024-07-24 18:55:55.279148] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:10.643 [2024-07-24 18:55:55.622661] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:10.902 [2024-07-24 18:55:55.830207] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:10.902 [2024-07-24 18:55:55.830321] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:11.160 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:11.160 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.161 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:11.161 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:11.161 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.161 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.161 18:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.161 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.161 "name": "raid_bdev1", 00:19:11.161 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:11.161 "strip_size_kb": 0, 00:19:11.161 "state": "online", 00:19:11.161 "raid_level": "raid1", 00:19:11.161 "superblock": true, 00:19:11.161 "num_base_bdevs": 2, 00:19:11.161 "num_base_bdevs_discovered": 2, 00:19:11.161 "num_base_bdevs_operational": 2, 00:19:11.161 "process": { 00:19:11.161 "type": "rebuild", 00:19:11.161 "target": "spare", 00:19:11.161 "progress": { 00:19:11.161 "blocks": 12288, 00:19:11.161 "percent": 19 00:19:11.161 } 00:19:11.161 }, 00:19:11.161 "base_bdevs_list": [ 00:19:11.161 { 00:19:11.161 "name": "spare", 00:19:11.161 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:11.161 "is_configured": true, 00:19:11.161 "data_offset": 2048, 00:19:11.161 "data_size": 63488 00:19:11.161 }, 00:19:11.161 { 00:19:11.161 "name": "BaseBdev2", 00:19:11.161 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:11.161 "is_configured": true, 00:19:11.161 "data_offset": 2048, 00:19:11.161 "data_size": 63488 00:19:11.161 } 00:19:11.161 ] 00:19:11.161 }' 00:19:11.161 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.161 [2024-07-24 18:55:56.161587] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:11.161 [2024-07-24 18:55:56.161931] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:11.419 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=637 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.419 [2024-07-24 18:55:56.364941] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:11.419 [2024-07-24 18:55:56.365129] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.419 "name": "raid_bdev1", 00:19:11.419 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:11.419 "strip_size_kb": 0, 00:19:11.419 "state": "online", 00:19:11.419 "raid_level": "raid1", 00:19:11.419 "superblock": true, 00:19:11.419 "num_base_bdevs": 2, 00:19:11.419 "num_base_bdevs_discovered": 2, 00:19:11.419 "num_base_bdevs_operational": 2, 00:19:11.419 "process": { 00:19:11.419 "type": "rebuild", 00:19:11.419 "target": "spare", 00:19:11.419 "progress": { 00:19:11.419 "blocks": 16384, 00:19:11.419 "percent": 25 00:19:11.419 } 00:19:11.419 }, 00:19:11.419 "base_bdevs_list": [ 00:19:11.419 { 00:19:11.419 "name": "spare", 00:19:11.419 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:11.419 
"is_configured": true, 00:19:11.419 "data_offset": 2048, 00:19:11.419 "data_size": 63488 00:19:11.419 }, 00:19:11.419 { 00:19:11.419 "name": "BaseBdev2", 00:19:11.419 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:11.419 "is_configured": true, 00:19:11.419 "data_offset": 2048, 00:19:11.419 "data_size": 63488 00:19:11.419 } 00:19:11.419 ] 00:19:11.419 }' 00:19:11.419 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.678 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:11.678 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.678 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:11.679 18:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:11.679 [2024-07-24 18:55:56.637545] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:19:12.245 [2024-07-24 18:55:57.011085] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:19:12.245 [2024-07-24 18:55:57.117662] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.504 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:12.762 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:12.762 "name": "raid_bdev1", 00:19:12.762 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:12.762 "strip_size_kb": 0, 00:19:12.762 "state": "online", 00:19:12.762 "raid_level": "raid1", 00:19:12.762 "superblock": true, 00:19:12.762 "num_base_bdevs": 2, 00:19:12.762 "num_base_bdevs_discovered": 2, 00:19:12.762 "num_base_bdevs_operational": 2, 00:19:12.762 "process": { 00:19:12.762 "type": "rebuild", 00:19:12.762 "target": "spare", 00:19:12.762 "progress": { 00:19:12.762 "blocks": 34816, 00:19:12.762 "percent": 54 00:19:12.762 } 00:19:12.762 }, 00:19:12.762 "base_bdevs_list": [ 00:19:12.763 { 00:19:12.763 "name": "spare", 00:19:12.763 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:12.763 "is_configured": true, 00:19:12.763 "data_offset": 2048, 00:19:12.763 "data_size": 63488 00:19:12.763 }, 00:19:12.763 { 00:19:12.763 "name": "BaseBdev2", 00:19:12.763 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:12.763 "is_configured": true, 00:19:12.763 "data_offset": 2048, 00:19:12.763 
"data_size": 63488 00:19:12.763 } 00:19:12.763 ] 00:19:12.763 }' 00:19:12.763 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:12.763 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:12.763 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:12.763 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:12.763 18:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:13.330 [2024-07-24 18:55:58.122626] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:19:13.330 [2024-07-24 18:55:58.330942] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:19:13.330 [2024-07-24 18:55:58.336120] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:19:13.898 [2024-07-24 18:55:58.639146] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:19:13.898 [2024-07-24 18:55:58.740536] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.898 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:14.157 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:14.157 "name": "raid_bdev1", 00:19:14.157 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:14.157 "strip_size_kb": 0, 00:19:14.157 "state": "online", 00:19:14.157 "raid_level": "raid1", 00:19:14.157 "superblock": true, 00:19:14.157 "num_base_bdevs": 2, 00:19:14.157 "num_base_bdevs_discovered": 2, 00:19:14.157 "num_base_bdevs_operational": 2, 00:19:14.157 "process": { 00:19:14.157 "type": "rebuild", 00:19:14.157 "target": "spare", 00:19:14.157 "progress": { 00:19:14.157 "blocks": 55296, 00:19:14.157 "percent": 87 00:19:14.157 } 00:19:14.157 }, 00:19:14.157 "base_bdevs_list": [ 00:19:14.157 { 00:19:14.157 "name": "spare", 00:19:14.157 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:14.157 "is_configured": true, 00:19:14.157 "data_offset": 2048, 00:19:14.157 "data_size": 63488 00:19:14.157 }, 00:19:14.157 { 00:19:14.157 "name": "BaseBdev2", 00:19:14.157 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:14.157 "is_configured": true, 00:19:14.157 
"data_offset": 2048, 00:19:14.157 "data_size": 63488 00:19:14.157 } 00:19:14.157 ] 00:19:14.157 }' 00:19:14.157 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:14.157 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:14.157 18:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:14.157 18:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:14.157 18:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:14.157 [2024-07-24 18:55:59.062001] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:19:14.415 [2024-07-24 18:55:59.284621] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:14.415 [2024-07-24 18:55:59.384943] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:14.415 [2024-07-24 18:55:59.386027] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:15.351 "name": "raid_bdev1", 00:19:15.351 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:15.351 "strip_size_kb": 0, 00:19:15.351 "state": "online", 00:19:15.351 "raid_level": "raid1", 00:19:15.351 "superblock": true, 00:19:15.351 "num_base_bdevs": 2, 00:19:15.351 "num_base_bdevs_discovered": 2, 00:19:15.351 "num_base_bdevs_operational": 2, 00:19:15.351 "base_bdevs_list": [ 00:19:15.351 { 00:19:15.351 "name": "spare", 00:19:15.351 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:15.351 "is_configured": true, 00:19:15.351 "data_offset": 2048, 00:19:15.351 "data_size": 63488 00:19:15.351 }, 00:19:15.351 { 00:19:15.351 "name": "BaseBdev2", 00:19:15.351 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:15.351 "is_configured": true, 00:19:15.351 "data_offset": 2048, 00:19:15.351 "data_size": 63488 00:19:15.351 } 00:19:15.351 ] 00:19:15.351 }' 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.351 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:15.609 "name": "raid_bdev1", 00:19:15.609 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:15.609 "strip_size_kb": 0, 00:19:15.609 "state": "online", 00:19:15.609 "raid_level": "raid1", 00:19:15.609 "superblock": true, 00:19:15.609 "num_base_bdevs": 2, 00:19:15.609 "num_base_bdevs_discovered": 2, 00:19:15.609 "num_base_bdevs_operational": 2, 00:19:15.609 "base_bdevs_list": [ 00:19:15.609 { 00:19:15.609 "name": "spare", 00:19:15.609 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:15.609 "is_configured": true, 00:19:15.609 "data_offset": 2048, 00:19:15.609 "data_size": 63488 00:19:15.609 }, 00:19:15.609 { 00:19:15.609 "name": "BaseBdev2", 00:19:15.609 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:15.609 "is_configured": true, 00:19:15.609 "data_offset": 2048, 00:19:15.609 "data_size": 63488 00:19:15.609 } 00:19:15.609 ] 00:19:15.609 }' 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.609 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.867 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.867 "name": "raid_bdev1", 00:19:15.868 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:15.868 "strip_size_kb": 0, 00:19:15.868 "state": "online", 00:19:15.868 "raid_level": "raid1", 00:19:15.868 "superblock": true, 00:19:15.868 "num_base_bdevs": 2, 00:19:15.868 "num_base_bdevs_discovered": 2, 00:19:15.868 "num_base_bdevs_operational": 2, 00:19:15.868 "base_bdevs_list": [ 00:19:15.868 { 00:19:15.868 "name": "spare", 00:19:15.868 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:15.868 "is_configured": true, 00:19:15.868 "data_offset": 2048, 00:19:15.868 "data_size": 63488 00:19:15.868 }, 00:19:15.868 { 00:19:15.868 "name": "BaseBdev2", 00:19:15.868 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:15.868 "is_configured": true, 00:19:15.868 "data_offset": 2048, 00:19:15.868 "data_size": 63488 00:19:15.868 } 00:19:15.868 ] 00:19:15.868 }' 00:19:15.868 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.868 18:56:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:16.434 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:16.434 [2024-07-24 18:56:01.296235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:16.434 [2024-07-24 18:56:01.296262] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:16.434 00:19:16.434 Latency(us) 00:19:16.435 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:16.435 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:16.435 raid_bdev1 : 10.04 113.98 341.94 0.00 0.00 11867.29 236.98 108352.85 00:19:16.435 =================================================================================================================== 00:19:16.435 Total : 113.98 341.94 0.00 0.00 11867.29 236.98 108352.85 00:19:16.435 [2024-07-24 18:56:01.322954] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.435 [2024-07-24 18:56:01.322991] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:16.435 [2024-07-24 18:56:01.323039] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:16.435 [2024-07-24 18:56:01.323044] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e73370 name raid_bdev1, state offline 00:19:16.435 0 00:19:16.435 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.435 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:16.694 18:56:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:16.694 /dev/nbd0 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:16.694 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:16.695 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:16.695 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:16.695 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:16.953 1+0 records in 00:19:16.953 1+0 records out 00:19:16.953 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226903 s, 18.1 MB/s 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:16.953 /dev/nbd1 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:16.953 1+0 records in 00:19:16.953 1+0 records out 00:19:16.953 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184999 s, 22.1 MB/s 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:16.953 18:56:01 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.953 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:17.212 18:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:17.212 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:17.471 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:17.729 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:17.730 [2024-07-24 18:56:02.672715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:17.730 [2024-07-24 18:56:02.672751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.730 [2024-07-24 18:56:02.672766] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cf0f80 00:19:17.730 [2024-07-24 18:56:02.672772] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.730 [2024-07-24 18:56:02.674003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.730 [2024-07-24 18:56:02.674025] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:17.730 [2024-07-24 18:56:02.674078] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:17.730 [2024-07-24 18:56:02.674097] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:17.730 [2024-07-24 18:56:02.674172] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:17.730 spare 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.730 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:17.989 [2024-07-24 18:56:02.774463] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf0790 00:19:17.989 [2024-07-24 18:56:02.774477] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:17.989 [2024-07-24 18:56:02.774601] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf1210 00:19:17.989 [2024-07-24 18:56:02.774701] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf0790 00:19:17.989 [2024-07-24 18:56:02.774706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cf0790 00:19:17.989 [2024-07-24 18:56:02.774778] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:17.989 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.989 "name": "raid_bdev1", 00:19:17.989 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:17.989 "strip_size_kb": 0, 00:19:17.989 "state": "online", 00:19:17.989 "raid_level": "raid1", 00:19:17.989 "superblock": true, 00:19:17.989 "num_base_bdevs": 2, 00:19:17.989 "num_base_bdevs_discovered": 2, 00:19:17.989 "num_base_bdevs_operational": 2, 00:19:17.989 "base_bdevs_list": [ 00:19:17.989 { 00:19:17.989 "name": "spare", 00:19:17.989 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:17.989 "is_configured": true, 00:19:17.989 "data_offset": 2048, 00:19:17.989 "data_size": 63488 00:19:17.989 }, 00:19:17.989 { 00:19:17.989 "name": "BaseBdev2", 00:19:17.989 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:17.989 "is_configured": true, 00:19:17.989 "data_offset": 2048, 00:19:17.989 "data_size": 63488 00:19:17.989 } 00:19:17.989 ] 00:19:17.989 }' 00:19:17.989 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.989 18:56:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:18.558 "name": "raid_bdev1", 00:19:18.558 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:18.558 "strip_size_kb": 0, 00:19:18.558 "state": "online", 00:19:18.558 "raid_level": "raid1", 00:19:18.558 "superblock": true, 00:19:18.558 "num_base_bdevs": 2, 00:19:18.558 "num_base_bdevs_discovered": 2, 00:19:18.558 "num_base_bdevs_operational": 2, 00:19:18.558 "base_bdevs_list": [ 00:19:18.558 { 00:19:18.558 "name": "spare", 00:19:18.558 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:18.558 "is_configured": true, 00:19:18.558 "data_offset": 2048, 00:19:18.558 "data_size": 63488 00:19:18.558 }, 00:19:18.558 { 00:19:18.558 "name": "BaseBdev2", 00:19:18.558 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:18.558 "is_configured": true, 00:19:18.558 
"data_offset": 2048, 00:19:18.558 "data_size": 63488 00:19:18.558 } 00:19:18.558 ] 00:19:18.558 }' 00:19:18.558 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:18.816 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:19.075 [2024-07-24 18:56:03.940150] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.075 18:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:19.335 18:56:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.335 "name": "raid_bdev1", 00:19:19.335 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:19.335 "strip_size_kb": 0, 00:19:19.335 "state": "online", 00:19:19.335 "raid_level": "raid1", 00:19:19.335 "superblock": true, 00:19:19.335 "num_base_bdevs": 2, 00:19:19.335 "num_base_bdevs_discovered": 1, 00:19:19.335 "num_base_bdevs_operational": 1, 00:19:19.335 "base_bdevs_list": [ 00:19:19.335 { 00:19:19.335 "name": null, 00:19:19.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.335 "is_configured": false, 00:19:19.335 "data_offset": 2048, 00:19:19.335 "data_size": 63488 00:19:19.335 }, 00:19:19.335 { 00:19:19.335 "name": "BaseBdev2", 00:19:19.335 "uuid": 
"bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:19.335 "is_configured": true, 00:19:19.335 "data_offset": 2048, 00:19:19.335 "data_size": 63488 00:19:19.335 } 00:19:19.335 ] 00:19:19.335 }' 00:19:19.335 18:56:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.335 18:56:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:19.594 18:56:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:19.853 [2024-07-24 18:56:04.726305] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:19.853 [2024-07-24 18:56:04.726430] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:19.853 [2024-07-24 18:56:04.726439] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:19.853 [2024-07-24 18:56:04.726458] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:19.853 [2024-07-24 18:56:04.731156] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e721d0 00:19:19.853 [2024-07-24 18:56:04.732538] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:19.853 18:56:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.788 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:21.047 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:21.047 "name": "raid_bdev1", 00:19:21.047 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:21.047 "strip_size_kb": 0, 00:19:21.047 "state": "online", 00:19:21.047 "raid_level": "raid1", 00:19:21.047 "superblock": true, 00:19:21.047 "num_base_bdevs": 2, 00:19:21.047 "num_base_bdevs_discovered": 2, 00:19:21.047 "num_base_bdevs_operational": 2, 00:19:21.047 "process": { 00:19:21.047 "type": "rebuild", 00:19:21.047 "target": "spare", 00:19:21.047 "progress": { 00:19:21.047 "blocks": 22528, 00:19:21.047 "percent": 35 00:19:21.047 } 00:19:21.047 }, 00:19:21.047 "base_bdevs_list": [ 00:19:21.047 { 00:19:21.047 "name": "spare", 00:19:21.047 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:21.047 "is_configured": true, 00:19:21.047 "data_offset": 2048, 00:19:21.047 "data_size": 63488 00:19:21.047 }, 00:19:21.047 { 00:19:21.047 "name": "BaseBdev2", 00:19:21.047 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:21.047 "is_configured": true, 00:19:21.047 "data_offset": 2048, 00:19:21.047 
"data_size": 63488 00:19:21.047 } 00:19:21.047 ] 00:19:21.047 }' 00:19:21.047 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:21.047 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:21.047 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:21.047 18:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:21.047 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:21.306 [2024-07-24 18:56:06.143032] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:21.306 [2024-07-24 18:56:06.243320] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:21.306 [2024-07-24 18:56:06.243355] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:21.306 [2024-07-24 18:56:06.243365] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:21.306 [2024-07-24 18:56:06.243369] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.306 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:21.564 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.564 "name": "raid_bdev1", 00:19:21.564 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:21.564 "strip_size_kb": 0, 00:19:21.564 "state": "online", 00:19:21.564 "raid_level": "raid1", 00:19:21.564 "superblock": true, 00:19:21.564 "num_base_bdevs": 2, 00:19:21.564 "num_base_bdevs_discovered": 1, 00:19:21.564 "num_base_bdevs_operational": 1, 00:19:21.564 "base_bdevs_list": [ 00:19:21.564 { 00:19:21.564 "name": null, 00:19:21.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.564 "is_configured": false, 00:19:21.564 "data_offset": 2048, 00:19:21.564 "data_size": 63488 00:19:21.564 }, 00:19:21.564 
{ 00:19:21.564 "name": "BaseBdev2", 00:19:21.564 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:21.564 "is_configured": true, 00:19:21.564 "data_offset": 2048, 00:19:21.564 "data_size": 63488 00:19:21.564 } 00:19:21.564 ] 00:19:21.564 }' 00:19:21.564 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.564 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:22.150 18:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:22.150 [2024-07-24 18:56:07.085815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:22.150 [2024-07-24 18:56:07.085857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:22.150 [2024-07-24 18:56:07.085885] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e72ad0 00:19:22.150 [2024-07-24 18:56:07.085891] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:22.150 [2024-07-24 18:56:07.086174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:22.150 [2024-07-24 18:56:07.086184] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:22.151 [2024-07-24 18:56:07.086240] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:22.151 [2024-07-24 18:56:07.086247] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:22.151 [2024-07-24 18:56:07.086251] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:22.151 [2024-07-24 18:56:07.086262] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:22.151 [2024-07-24 18:56:07.090815] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e721d0 00:19:22.151 spare 00:19:22.151 [2024-07-24 18:56:07.091881] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:22.151 18:56:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:23.525 "name": "raid_bdev1", 00:19:23.525 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:23.525 "strip_size_kb": 0, 00:19:23.525 "state": "online", 00:19:23.525 "raid_level": "raid1", 00:19:23.525 "superblock": true, 00:19:23.525 "num_base_bdevs": 2, 00:19:23.525 "num_base_bdevs_discovered": 2, 00:19:23.525 "num_base_bdevs_operational": 2, 00:19:23.525 "process": { 00:19:23.525 "type": "rebuild", 00:19:23.525 "target": "spare", 00:19:23.525 "progress": { 00:19:23.525 "blocks": 22528, 00:19:23.525 "percent": 35 00:19:23.525 } 00:19:23.525 }, 00:19:23.525 "base_bdevs_list": [ 00:19:23.525 { 00:19:23.525 "name": "spare", 00:19:23.525 "uuid": "f6048c3f-bcc7-5140-acb3-3feb67aee1b2", 00:19:23.525 "is_configured": true, 00:19:23.525 "data_offset": 2048, 00:19:23.525 "data_size": 63488 00:19:23.525 }, 00:19:23.525 { 00:19:23.525 "name": "BaseBdev2", 00:19:23.525 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:23.525 "is_configured": true, 00:19:23.525 "data_offset": 2048, 00:19:23.525 "data_size": 63488 00:19:23.525 } 00:19:23.525 ] 00:19:23.525 }' 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:23.525 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:23.525 [2024-07-24 18:56:08.527834] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:23.784 [2024-07-24 18:56:08.602405] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:23.784 [2024-07-24 
18:56:08.602433] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:23.784 [2024-07-24 18:56:08.602442] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:23.784 [2024-07-24 18:56:08.602462] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.784 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.043 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.043 "name": "raid_bdev1", 00:19:24.043 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:24.043 "strip_size_kb": 0, 00:19:24.043 "state": "online", 00:19:24.043 "raid_level": "raid1", 00:19:24.043 "superblock": true, 00:19:24.043 "num_base_bdevs": 2, 00:19:24.043 "num_base_bdevs_discovered": 1, 00:19:24.043 "num_base_bdevs_operational": 1, 00:19:24.043 "base_bdevs_list": [ 00:19:24.043 { 00:19:24.043 "name": null, 00:19:24.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.043 "is_configured": false, 00:19:24.043 "data_offset": 2048, 00:19:24.043 "data_size": 63488 00:19:24.043 }, 00:19:24.043 { 00:19:24.043 "name": "BaseBdev2", 00:19:24.043 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:24.043 "is_configured": true, 00:19:24.043 "data_offset": 2048, 00:19:24.043 "data_size": 63488 00:19:24.043 } 00:19:24.043 ] 00:19:24.043 }' 00:19:24.043 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.043 18:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.300 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:24.558 "name": "raid_bdev1", 00:19:24.558 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:24.558 "strip_size_kb": 0, 00:19:24.558 "state": "online", 00:19:24.558 "raid_level": "raid1", 00:19:24.558 "superblock": true, 00:19:24.558 "num_base_bdevs": 2, 00:19:24.558 "num_base_bdevs_discovered": 1, 00:19:24.558 "num_base_bdevs_operational": 1, 00:19:24.558 "base_bdevs_list": [ 00:19:24.558 { 00:19:24.558 "name": null, 00:19:24.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.558 "is_configured": false, 00:19:24.558 "data_offset": 2048, 00:19:24.558 "data_size": 63488 00:19:24.558 }, 00:19:24.558 { 00:19:24.558 "name": "BaseBdev2", 00:19:24.558 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:24.558 "is_configured": true, 00:19:24.558 "data_offset": 2048, 00:19:24.558 "data_size": 63488 00:19:24.558 } 00:19:24.558 ] 00:19:24.558 }' 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:24.558 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:24.817 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:25.075 [2024-07-24 18:56:09.849951] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:25.075 [2024-07-24 18:56:09.849986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.075 [2024-07-24 18:56:09.849997] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cca360 00:19:25.075 [2024-07-24 18:56:09.850003] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.075 [2024-07-24 18:56:09.850260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.075 [2024-07-24 18:56:09.850270] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:25.075 [2024-07-24 18:56:09.850312] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:19:25.075 [2024-07-24 18:56:09.850319] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:25.075 [2024-07-24 18:56:09.850324] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:25.075 BaseBdev1 00:19:25.075 18:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:19:26.011 18:56:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.011 18:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.269 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.269 "name": "raid_bdev1", 00:19:26.269 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:26.269 "strip_size_kb": 0, 00:19:26.269 "state": "online", 00:19:26.269 "raid_level": "raid1", 00:19:26.269 "superblock": true, 00:19:26.269 "num_base_bdevs": 2, 00:19:26.269 "num_base_bdevs_discovered": 1, 00:19:26.269 "num_base_bdevs_operational": 1, 00:19:26.269 "base_bdevs_list": [ 00:19:26.269 { 00:19:26.269 "name": null, 00:19:26.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:26.269 "is_configured": false, 00:19:26.269 "data_offset": 2048, 00:19:26.269 "data_size": 63488 00:19:26.269 }, 00:19:26.269 { 00:19:26.269 "name": "BaseBdev2", 00:19:26.269 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:26.269 "is_configured": true, 00:19:26.269 "data_offset": 2048, 00:19:26.269 "data_size": 63488 00:19:26.269 } 00:19:26.269 ] 00:19:26.269 }' 00:19:26.269 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.269 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.835 18:56:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:26.835 "name": "raid_bdev1", 00:19:26.835 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:26.835 "strip_size_kb": 0, 00:19:26.835 "state": "online", 00:19:26.835 "raid_level": "raid1", 00:19:26.835 "superblock": true, 00:19:26.835 "num_base_bdevs": 2, 00:19:26.835 "num_base_bdevs_discovered": 1, 00:19:26.835 "num_base_bdevs_operational": 1, 00:19:26.835 "base_bdevs_list": [ 00:19:26.835 { 00:19:26.835 "name": null, 00:19:26.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:26.835 "is_configured": false, 00:19:26.835 "data_offset": 2048, 00:19:26.835 "data_size": 63488 00:19:26.835 }, 00:19:26.835 { 00:19:26.835 "name": "BaseBdev2", 00:19:26.835 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:26.835 "is_configured": true, 00:19:26.835 "data_offset": 2048, 00:19:26.835 "data_size": 63488 00:19:26.835 } 00:19:26.835 ] 00:19:26.835 }' 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:26.835 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:27.092 [2024-07-24 18:56:11.943551] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:27.092 [2024-07-24 
18:56:11.943654] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:27.092 [2024-07-24 18:56:11.943672] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:27.092 request: 00:19:27.092 { 00:19:27.092 "base_bdev": "BaseBdev1", 00:19:27.092 "raid_bdev": "raid_bdev1", 00:19:27.092 "method": "bdev_raid_add_base_bdev", 00:19:27.092 "req_id": 1 00:19:27.092 } 00:19:27.092 Got JSON-RPC error response 00:19:27.092 response: 00:19:27.092 { 00:19:27.092 "code": -22, 00:19:27.092 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:27.092 } 00:19:27.092 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:19:27.092 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:27.092 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:27.092 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:27.092 18:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.027 18:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.285 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.285 "name": "raid_bdev1", 00:19:28.285 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:28.285 "strip_size_kb": 0, 00:19:28.285 "state": "online", 00:19:28.285 "raid_level": "raid1", 00:19:28.285 "superblock": true, 00:19:28.285 "num_base_bdevs": 2, 00:19:28.285 "num_base_bdevs_discovered": 1, 00:19:28.285 "num_base_bdevs_operational": 1, 00:19:28.285 "base_bdevs_list": [ 00:19:28.285 { 00:19:28.285 "name": null, 00:19:28.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:28.285 "is_configured": false, 00:19:28.285 "data_offset": 2048, 00:19:28.285 "data_size": 63488 00:19:28.285 }, 00:19:28.285 { 00:19:28.285 "name": "BaseBdev2", 00:19:28.285 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:28.285 "is_configured": true, 
00:19:28.285 "data_offset": 2048, 00:19:28.285 "data_size": 63488 00:19:28.285 } 00:19:28.285 ] 00:19:28.285 }' 00:19:28.285 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.285 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:28.848 "name": "raid_bdev1", 00:19:28.848 "uuid": "1eb6d41a-d940-43b7-9971-e9600e586731", 00:19:28.848 "strip_size_kb": 0, 00:19:28.848 "state": "online", 00:19:28.848 "raid_level": "raid1", 00:19:28.848 "superblock": true, 00:19:28.848 "num_base_bdevs": 2, 00:19:28.848 "num_base_bdevs_discovered": 1, 00:19:28.848 "num_base_bdevs_operational": 1, 00:19:28.848 "base_bdevs_list": [ 00:19:28.848 { 00:19:28.848 "name": null, 00:19:28.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:28.848 "is_configured": false, 00:19:28.848 "data_offset": 2048, 00:19:28.848 "data_size": 63488 00:19:28.848 }, 00:19:28.848 { 00:19:28.848 "name": "BaseBdev2", 00:19:28.848 "uuid": "bd600703-cf95-5d56-a08a-c9c71af9182b", 00:19:28.848 "is_configured": true, 00:19:28.848 "data_offset": 2048, 00:19:28.848 "data_size": 63488 00:19:28.848 } 00:19:28.848 ] 00:19:28.848 }' 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:28.848 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2160440 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2160440 ']' 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2160440 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2160440 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2160440' 00:19:29.106 killing process with pid 2160440 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2160440 00:19:29.106 Received shutdown signal, test time was about 22.585182 seconds 00:19:29.106 00:19:29.106 Latency(us) 00:19:29.106 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:29.106 =================================================================================================================== 00:19:29.106 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:29.106 [2024-07-24 18:56:13.899126] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:29.106 [2024-07-24 18:56:13.899201] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:29.106 [2024-07-24 18:56:13.899232] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.106 [2024-07-24 18:56:13.899238] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf0790 name raid_bdev1, state offline 00:19:29.106 18:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2160440 00:19:29.106 [2024-07-24 18:56:13.917959] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:29.106 18:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:29.106 00:19:29.106 real 0m26.150s 00:19:29.106 user 0m40.080s 00:19:29.106 sys 0m2.911s 00:19:29.106 18:56:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:29.106 18:56:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:29.106 ************************************ 00:19:29.106 END TEST raid_rebuild_test_sb_io 00:19:29.106 ************************************ 00:19:29.365 18:56:14 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:19:29.365 18:56:14 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:19:29.365 18:56:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:29.365 18:56:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:29.365 18:56:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:29.365 ************************************ 00:19:29.365 START TEST raid_rebuild_test 00:19:29.365 ************************************ 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2165151 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2165151 /var/tmp/spdk-raid.sock 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2165151 ']' 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:29.365 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:29.365 18:56:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.365 [2024-07-24 18:56:14.231422] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:19:29.365 [2024-07-24 18:56:14.231463] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2165151 ] 00:19:29.365 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:29.365 Zero copy mechanism will not be used. 00:19:29.365 [2024-07-24 18:56:14.298811] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:29.624 [2024-07-24 18:56:14.378007] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:29.624 [2024-07-24 18:56:14.429195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:29.624 [2024-07-24 18:56:14.429220] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:30.190 18:56:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:30.190 18:56:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:19:30.190 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:30.190 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:30.190 BaseBdev1_malloc 00:19:30.449 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:30.449 [2024-07-24 18:56:15.352759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:30.449 [2024-07-24 18:56:15.352794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:30.449 [2024-07-24 18:56:15.352806] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a7130 00:19:30.449 [2024-07-24 18:56:15.352812] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:30.449 [2024-07-24 18:56:15.353879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:30.449 [2024-07-24 18:56:15.353898] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:30.449 BaseBdev1 00:19:30.449 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:30.449 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:30.707 BaseBdev2_malloc 00:19:30.707 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:30.707 [2024-07-24 18:56:15.684962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:30.707 [2024-07-24 18:56:15.684993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:30.707 [2024-07-24 18:56:15.685003] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b4cfa0 00:19:30.707 [2024-07-24 18:56:15.685009] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:30.707 [2024-07-24 18:56:15.686019] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:30.707 [2024-07-24 18:56:15.686040] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:30.707 BaseBdev2 00:19:30.707 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:30.707 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:30.966 BaseBdev3_malloc 00:19:30.966 18:56:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:31.224 [2024-07-24 18:56:16.033298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:31.224 [2024-07-24 18:56:16.033325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.224 [2024-07-24 18:56:16.033335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b58970 00:19:31.224 [2024-07-24 18:56:16.033340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.224 [2024-07-24 18:56:16.034317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.224 [2024-07-24 18:56:16.034336] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:31.224 BaseBdev3 00:19:31.224 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:31.224 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:31.224 BaseBdev4_malloc 00:19:31.224 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:19:31.483 [2024-07-24 18:56:16.373604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:31.483 [2024-07-24 18:56:16.373629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.483 [2024-07-24 18:56:16.373641] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b4f8c0 00:19:31.483 [2024-07-24 18:56:16.373648] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.483 [2024-07-24 18:56:16.374591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.483 [2024-07-24 18:56:16.374610] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:31.483 BaseBdev4 00:19:31.483 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:31.741 spare_malloc 00:19:31.741 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:31.741 spare_delay 00:19:31.741 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:31.999 [2024-07-24 18:56:16.890270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:31.999 [2024-07-24 18:56:16.890296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.999 [2024-07-24 18:56:16.890305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x199fbf0 00:19:31.999 [2024-07-24 18:56:16.890311] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.000 [2024-07-24 18:56:16.891257] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.000 [2024-07-24 18:56:16.891276] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:32.000 spare 00:19:32.000 18:56:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:32.258 [2024-07-24 18:56:17.054715] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:32.258 [2024-07-24 18:56:17.055507] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:32.258 [2024-07-24 18:56:17.055545] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:32.258 [2024-07-24 18:56:17.055577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:32.258 [2024-07-24 18:56:17.055627] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19a1990 00:19:32.258 [2024-07-24 18:56:17.055633] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:32.258 [2024-07-24 18:56:17.055758] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a51f0 00:19:32.258 [2024-07-24 18:56:17.055856] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19a1990 00:19:32.258 [2024-07-24 18:56:17.055861] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19a1990 00:19:32.258 [2024-07-24 18:56:17.055929] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.258 18:56:17 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.258 "name": "raid_bdev1", 00:19:32.258 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:32.258 "strip_size_kb": 0, 00:19:32.258 "state": "online", 00:19:32.258 "raid_level": "raid1", 00:19:32.258 "superblock": false, 00:19:32.258 "num_base_bdevs": 4, 00:19:32.258 "num_base_bdevs_discovered": 4, 00:19:32.258 "num_base_bdevs_operational": 4, 00:19:32.258 "base_bdevs_list": [ 00:19:32.258 { 00:19:32.258 "name": "BaseBdev1", 00:19:32.258 "uuid": "7e0c6dc0-5fa1-57a5-8887-0de7659bcff4", 00:19:32.258 "is_configured": true, 00:19:32.258 "data_offset": 0, 00:19:32.258 "data_size": 65536 00:19:32.258 }, 00:19:32.258 { 00:19:32.258 "name": "BaseBdev2", 00:19:32.258 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:32.258 "is_configured": true, 00:19:32.258 "data_offset": 0, 00:19:32.258 "data_size": 65536 00:19:32.258 }, 00:19:32.258 { 00:19:32.258 "name": "BaseBdev3", 00:19:32.258 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:32.258 "is_configured": true, 00:19:32.258 "data_offset": 0, 00:19:32.258 "data_size": 65536 00:19:32.258 }, 00:19:32.258 { 00:19:32.258 "name": "BaseBdev4", 00:19:32.258 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:32.258 "is_configured": true, 00:19:32.258 "data_offset": 0, 00:19:32.258 "data_size": 65536 00:19:32.258 } 00:19:32.258 ] 00:19:32.258 }' 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.258 18:56:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.824 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:32.824 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:33.083 [2024-07-24 18:56:17.905097] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:33.083 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:33.083 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.083 18:56:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 
-- # nbd_list=('/dev/nbd0') 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:33.342 [2024-07-24 18:56:18.245816] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a6550 00:19:33.342 /dev/nbd0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:33.342 1+0 records in 00:19:33.342 1+0 records out 00:19:33.342 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222723 s, 18.4 MB/s 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:33.342 18:56:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:38.606 65536+0 records in 00:19:38.606 65536+0 records out 00:19:38.606 33554432 bytes (34 MB, 32 MiB) copied, 4.76907 s, 7.0 MB/s 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:38.606 18:56:23 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:38.606 [2024-07-24 18:56:23.270383] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:38.606 [2024-07-24 18:56:23.422810] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.606 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.863 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.863 "name": "raid_bdev1", 00:19:38.863 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:38.863 "strip_size_kb": 0, 00:19:38.863 "state": "online", 00:19:38.863 
"raid_level": "raid1", 00:19:38.863 "superblock": false, 00:19:38.863 "num_base_bdevs": 4, 00:19:38.863 "num_base_bdevs_discovered": 3, 00:19:38.863 "num_base_bdevs_operational": 3, 00:19:38.863 "base_bdevs_list": [ 00:19:38.863 { 00:19:38.863 "name": null, 00:19:38.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.863 "is_configured": false, 00:19:38.863 "data_offset": 0, 00:19:38.863 "data_size": 65536 00:19:38.863 }, 00:19:38.863 { 00:19:38.863 "name": "BaseBdev2", 00:19:38.863 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:38.863 "is_configured": true, 00:19:38.863 "data_offset": 0, 00:19:38.863 "data_size": 65536 00:19:38.863 }, 00:19:38.863 { 00:19:38.863 "name": "BaseBdev3", 00:19:38.863 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:38.863 "is_configured": true, 00:19:38.863 "data_offset": 0, 00:19:38.863 "data_size": 65536 00:19:38.863 }, 00:19:38.863 { 00:19:38.863 "name": "BaseBdev4", 00:19:38.864 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:38.864 "is_configured": true, 00:19:38.864 "data_offset": 0, 00:19:38.864 "data_size": 65536 00:19:38.864 } 00:19:38.864 ] 00:19:38.864 }' 00:19:38.864 18:56:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.864 18:56:23 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.120 18:56:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:39.378 [2024-07-24 18:56:24.248962] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:39.378 [2024-07-24 18:56:24.252674] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a3530 00:19:39.378 [2024-07-24 18:56:24.254333] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:39.378 18:56:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.314 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:40.572 "name": "raid_bdev1", 00:19:40.572 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:40.572 "strip_size_kb": 0, 00:19:40.572 "state": "online", 00:19:40.572 "raid_level": "raid1", 00:19:40.572 "superblock": false, 00:19:40.572 "num_base_bdevs": 4, 00:19:40.572 "num_base_bdevs_discovered": 4, 00:19:40.572 "num_base_bdevs_operational": 4, 00:19:40.572 "process": { 00:19:40.572 "type": "rebuild", 00:19:40.572 "target": "spare", 00:19:40.572 "progress": { 00:19:40.572 "blocks": 22528, 00:19:40.572 "percent": 34 00:19:40.572 } 00:19:40.572 }, 00:19:40.572 
"base_bdevs_list": [ 00:19:40.572 { 00:19:40.572 "name": "spare", 00:19:40.572 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:40.572 "is_configured": true, 00:19:40.572 "data_offset": 0, 00:19:40.572 "data_size": 65536 00:19:40.572 }, 00:19:40.572 { 00:19:40.572 "name": "BaseBdev2", 00:19:40.572 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:40.572 "is_configured": true, 00:19:40.572 "data_offset": 0, 00:19:40.572 "data_size": 65536 00:19:40.572 }, 00:19:40.572 { 00:19:40.572 "name": "BaseBdev3", 00:19:40.572 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:40.572 "is_configured": true, 00:19:40.572 "data_offset": 0, 00:19:40.572 "data_size": 65536 00:19:40.572 }, 00:19:40.572 { 00:19:40.572 "name": "BaseBdev4", 00:19:40.572 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:40.572 "is_configured": true, 00:19:40.572 "data_offset": 0, 00:19:40.572 "data_size": 65536 00:19:40.572 } 00:19:40.572 ] 00:19:40.572 }' 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:40.572 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:40.831 [2024-07-24 18:56:25.698633] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:40.831 [2024-07-24 18:56:25.764880] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:40.831 [2024-07-24 18:56:25.764908] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:40.831 [2024-07-24 18:56:25.764918] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:40.831 [2024-07-24 18:56:25.764922] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.831 18:56:25 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.089 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.089 "name": "raid_bdev1", 00:19:41.089 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:41.089 "strip_size_kb": 0, 00:19:41.089 "state": "online", 00:19:41.089 "raid_level": "raid1", 00:19:41.089 "superblock": false, 00:19:41.089 "num_base_bdevs": 4, 00:19:41.089 "num_base_bdevs_discovered": 3, 00:19:41.089 "num_base_bdevs_operational": 3, 00:19:41.089 "base_bdevs_list": [ 00:19:41.089 { 00:19:41.089 "name": null, 00:19:41.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.089 "is_configured": false, 00:19:41.089 "data_offset": 0, 00:19:41.089 "data_size": 65536 00:19:41.089 }, 00:19:41.089 { 00:19:41.089 "name": "BaseBdev2", 00:19:41.089 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:41.089 "is_configured": true, 00:19:41.089 "data_offset": 0, 00:19:41.089 "data_size": 65536 00:19:41.089 }, 00:19:41.089 { 00:19:41.089 "name": "BaseBdev3", 00:19:41.089 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:41.089 "is_configured": true, 00:19:41.089 "data_offset": 0, 00:19:41.089 "data_size": 65536 00:19:41.089 }, 00:19:41.089 { 00:19:41.089 "name": "BaseBdev4", 00:19:41.089 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:41.089 "is_configured": true, 00:19:41.089 "data_offset": 0, 00:19:41.089 "data_size": 65536 00:19:41.089 } 00:19:41.089 ] 00:19:41.089 }' 00:19:41.089 18:56:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.089 18:56:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.654 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:41.655 "name": "raid_bdev1", 00:19:41.655 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:41.655 "strip_size_kb": 0, 00:19:41.655 "state": "online", 00:19:41.655 "raid_level": "raid1", 00:19:41.655 "superblock": false, 00:19:41.655 "num_base_bdevs": 4, 00:19:41.655 "num_base_bdevs_discovered": 3, 00:19:41.655 "num_base_bdevs_operational": 3, 00:19:41.655 "base_bdevs_list": [ 00:19:41.655 { 00:19:41.655 "name": null, 00:19:41.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.655 "is_configured": false, 00:19:41.655 "data_offset": 0, 00:19:41.655 "data_size": 65536 00:19:41.655 }, 00:19:41.655 { 00:19:41.655 "name": "BaseBdev2", 00:19:41.655 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:41.655 "is_configured": true, 00:19:41.655 "data_offset": 0, 00:19:41.655 "data_size": 65536 00:19:41.655 }, 00:19:41.655 { 00:19:41.655 "name": "BaseBdev3", 00:19:41.655 "uuid": 
"02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:41.655 "is_configured": true, 00:19:41.655 "data_offset": 0, 00:19:41.655 "data_size": 65536 00:19:41.655 }, 00:19:41.655 { 00:19:41.655 "name": "BaseBdev4", 00:19:41.655 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:41.655 "is_configured": true, 00:19:41.655 "data_offset": 0, 00:19:41.655 "data_size": 65536 00:19:41.655 } 00:19:41.655 ] 00:19:41.655 }' 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:41.655 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:41.915 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:41.915 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:41.915 [2024-07-24 18:56:26.847344] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:41.915 [2024-07-24 18:56:26.850879] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a6610 00:19:41.915 [2024-07-24 18:56:26.851911] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:41.915 18:56:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.887 18:56:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:43.146 "name": "raid_bdev1", 00:19:43.146 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:43.146 "strip_size_kb": 0, 00:19:43.146 "state": "online", 00:19:43.146 "raid_level": "raid1", 00:19:43.146 "superblock": false, 00:19:43.146 "num_base_bdevs": 4, 00:19:43.146 "num_base_bdevs_discovered": 4, 00:19:43.146 "num_base_bdevs_operational": 4, 00:19:43.146 "process": { 00:19:43.146 "type": "rebuild", 00:19:43.146 "target": "spare", 00:19:43.146 "progress": { 00:19:43.146 "blocks": 22528, 00:19:43.146 "percent": 34 00:19:43.146 } 00:19:43.146 }, 00:19:43.146 "base_bdevs_list": [ 00:19:43.146 { 00:19:43.146 "name": "spare", 00:19:43.146 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:43.146 "is_configured": true, 00:19:43.146 "data_offset": 0, 00:19:43.146 "data_size": 65536 00:19:43.146 }, 00:19:43.146 { 00:19:43.146 "name": "BaseBdev2", 00:19:43.146 "uuid": "016f15d8-739c-5332-aa83-207d453f7d8e", 00:19:43.146 "is_configured": true, 00:19:43.146 "data_offset": 0, 00:19:43.146 "data_size": 65536 00:19:43.146 }, 00:19:43.146 { 
00:19:43.146 "name": "BaseBdev3", 00:19:43.146 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:43.146 "is_configured": true, 00:19:43.146 "data_offset": 0, 00:19:43.146 "data_size": 65536 00:19:43.146 }, 00:19:43.146 { 00:19:43.146 "name": "BaseBdev4", 00:19:43.146 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:43.146 "is_configured": true, 00:19:43.146 "data_offset": 0, 00:19:43.146 "data_size": 65536 00:19:43.146 } 00:19:43.146 ] 00:19:43.146 }' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:19:43.146 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:43.405 [2024-07-24 18:56:28.260151] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:43.405 [2024-07-24 18:56:28.261753] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x19a6610 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.405 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:43.664 "name": "raid_bdev1", 00:19:43.664 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:43.664 "strip_size_kb": 0, 00:19:43.664 "state": "online", 00:19:43.664 "raid_level": "raid1", 00:19:43.664 "superblock": false, 00:19:43.664 "num_base_bdevs": 4, 00:19:43.664 "num_base_bdevs_discovered": 3, 00:19:43.664 "num_base_bdevs_operational": 3, 00:19:43.664 "process": { 00:19:43.664 "type": "rebuild", 00:19:43.664 "target": "spare", 00:19:43.664 "progress": { 00:19:43.664 "blocks": 30720, 00:19:43.664 "percent": 46 00:19:43.664 } 00:19:43.664 }, 00:19:43.664 "base_bdevs_list": [ 00:19:43.664 { 
00:19:43.664 "name": "spare", 00:19:43.664 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:43.664 "is_configured": true, 00:19:43.664 "data_offset": 0, 00:19:43.664 "data_size": 65536 00:19:43.664 }, 00:19:43.664 { 00:19:43.664 "name": null, 00:19:43.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.664 "is_configured": false, 00:19:43.664 "data_offset": 0, 00:19:43.664 "data_size": 65536 00:19:43.664 }, 00:19:43.664 { 00:19:43.664 "name": "BaseBdev3", 00:19:43.664 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:43.664 "is_configured": true, 00:19:43.664 "data_offset": 0, 00:19:43.664 "data_size": 65536 00:19:43.664 }, 00:19:43.664 { 00:19:43.664 "name": "BaseBdev4", 00:19:43.664 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:43.664 "is_configured": true, 00:19:43.664 "data_offset": 0, 00:19:43.664 "data_size": 65536 00:19:43.664 } 00:19:43.664 ] 00:19:43.664 }' 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=669 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.664 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:43.923 "name": "raid_bdev1", 00:19:43.923 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:43.923 "strip_size_kb": 0, 00:19:43.923 "state": "online", 00:19:43.923 "raid_level": "raid1", 00:19:43.923 "superblock": false, 00:19:43.923 "num_base_bdevs": 4, 00:19:43.923 "num_base_bdevs_discovered": 3, 00:19:43.923 "num_base_bdevs_operational": 3, 00:19:43.923 "process": { 00:19:43.923 "type": "rebuild", 00:19:43.923 "target": "spare", 00:19:43.923 "progress": { 00:19:43.923 "blocks": 36864, 00:19:43.923 "percent": 56 00:19:43.923 } 00:19:43.923 }, 00:19:43.923 "base_bdevs_list": [ 00:19:43.923 { 00:19:43.923 "name": "spare", 00:19:43.923 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:43.923 "is_configured": true, 00:19:43.923 "data_offset": 0, 00:19:43.923 "data_size": 65536 00:19:43.923 }, 00:19:43.923 { 00:19:43.923 "name": null, 00:19:43.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.923 "is_configured": false, 00:19:43.923 "data_offset": 0, 00:19:43.923 "data_size": 65536 00:19:43.923 }, 
00:19:43.923 { 00:19:43.923 "name": "BaseBdev3", 00:19:43.923 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:43.923 "is_configured": true, 00:19:43.923 "data_offset": 0, 00:19:43.923 "data_size": 65536 00:19:43.923 }, 00:19:43.923 { 00:19:43.923 "name": "BaseBdev4", 00:19:43.923 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:43.923 "is_configured": true, 00:19:43.923 "data_offset": 0, 00:19:43.923 "data_size": 65536 00:19:43.923 } 00:19:43.923 ] 00:19:43.923 }' 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:43.923 18:56:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:44.858 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:44.858 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.859 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.117 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:45.117 "name": "raid_bdev1", 00:19:45.117 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:45.117 "strip_size_kb": 0, 00:19:45.117 "state": "online", 00:19:45.117 "raid_level": "raid1", 00:19:45.117 "superblock": false, 00:19:45.117 "num_base_bdevs": 4, 00:19:45.117 "num_base_bdevs_discovered": 3, 00:19:45.117 "num_base_bdevs_operational": 3, 00:19:45.117 "process": { 00:19:45.117 "type": "rebuild", 00:19:45.117 "target": "spare", 00:19:45.117 "progress": { 00:19:45.117 "blocks": 61440, 00:19:45.117 "percent": 93 00:19:45.117 } 00:19:45.117 }, 00:19:45.117 "base_bdevs_list": [ 00:19:45.117 { 00:19:45.117 "name": "spare", 00:19:45.117 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:45.117 "is_configured": true, 00:19:45.117 "data_offset": 0, 00:19:45.117 "data_size": 65536 00:19:45.117 }, 00:19:45.117 { 00:19:45.117 "name": null, 00:19:45.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.117 "is_configured": false, 00:19:45.117 "data_offset": 0, 00:19:45.117 "data_size": 65536 00:19:45.117 }, 00:19:45.117 { 00:19:45.117 "name": "BaseBdev3", 00:19:45.117 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:45.117 "is_configured": true, 00:19:45.117 "data_offset": 0, 00:19:45.117 "data_size": 65536 00:19:45.117 }, 00:19:45.117 { 00:19:45.117 "name": "BaseBdev4", 00:19:45.117 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:45.117 "is_configured": true, 00:19:45.117 "data_offset": 0, 00:19:45.117 "data_size": 65536 
00:19:45.117 } 00:19:45.117 ] 00:19:45.117 }' 00:19:45.117 18:56:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:45.117 18:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:45.117 18:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:45.117 18:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:45.117 18:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:45.117 [2024-07-24 18:56:30.074422] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:45.117 [2024-07-24 18:56:30.074464] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:45.117 [2024-07-24 18:56:30.074497] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.052 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.310 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:46.310 "name": "raid_bdev1", 00:19:46.310 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:46.310 "strip_size_kb": 0, 00:19:46.310 "state": "online", 00:19:46.310 "raid_level": "raid1", 00:19:46.310 "superblock": false, 00:19:46.310 "num_base_bdevs": 4, 00:19:46.310 "num_base_bdevs_discovered": 3, 00:19:46.310 "num_base_bdevs_operational": 3, 00:19:46.310 "base_bdevs_list": [ 00:19:46.310 { 00:19:46.310 "name": "spare", 00:19:46.310 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:46.310 "is_configured": true, 00:19:46.310 "data_offset": 0, 00:19:46.310 "data_size": 65536 00:19:46.310 }, 00:19:46.310 { 00:19:46.310 "name": null, 00:19:46.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.310 "is_configured": false, 00:19:46.310 "data_offset": 0, 00:19:46.310 "data_size": 65536 00:19:46.310 }, 00:19:46.310 { 00:19:46.310 "name": "BaseBdev3", 00:19:46.310 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:46.310 "is_configured": true, 00:19:46.310 "data_offset": 0, 00:19:46.310 "data_size": 65536 00:19:46.310 }, 00:19:46.310 { 00:19:46.310 "name": "BaseBdev4", 00:19:46.310 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:46.310 "is_configured": true, 00:19:46.310 "data_offset": 0, 00:19:46.310 "data_size": 65536 00:19:46.310 } 00:19:46.310 ] 00:19:46.310 }' 00:19:46.310 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:46.310 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:19:46.310 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:46.569 "name": "raid_bdev1", 00:19:46.569 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:46.569 "strip_size_kb": 0, 00:19:46.569 "state": "online", 00:19:46.569 "raid_level": "raid1", 00:19:46.569 "superblock": false, 00:19:46.569 "num_base_bdevs": 4, 00:19:46.569 "num_base_bdevs_discovered": 3, 00:19:46.569 "num_base_bdevs_operational": 3, 00:19:46.569 "base_bdevs_list": [ 00:19:46.569 { 00:19:46.569 "name": "spare", 00:19:46.569 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:46.569 "is_configured": true, 00:19:46.569 "data_offset": 0, 00:19:46.569 "data_size": 65536 00:19:46.569 }, 00:19:46.569 { 00:19:46.569 "name": null, 00:19:46.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.569 "is_configured": false, 00:19:46.569 "data_offset": 0, 00:19:46.569 "data_size": 65536 00:19:46.569 }, 00:19:46.569 { 00:19:46.569 "name": "BaseBdev3", 00:19:46.569 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:46.569 "is_configured": true, 00:19:46.569 "data_offset": 0, 00:19:46.569 "data_size": 65536 00:19:46.569 }, 00:19:46.569 { 00:19:46.569 "name": "BaseBdev4", 00:19:46.569 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:46.569 "is_configured": true, 00:19:46.569 "data_offset": 0, 00:19:46.569 "data_size": 65536 00:19:46.569 } 00:19:46.569 ] 00:19:46.569 }' 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- 
# local strip_size=0 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.569 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.828 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.828 "name": "raid_bdev1", 00:19:46.828 "uuid": "19662012-7029-4d81-a045-1f1637009a9a", 00:19:46.828 "strip_size_kb": 0, 00:19:46.828 "state": "online", 00:19:46.828 "raid_level": "raid1", 00:19:46.828 "superblock": false, 00:19:46.828 "num_base_bdevs": 4, 00:19:46.828 "num_base_bdevs_discovered": 3, 00:19:46.828 "num_base_bdevs_operational": 3, 00:19:46.828 "base_bdevs_list": [ 00:19:46.828 { 00:19:46.828 "name": "spare", 00:19:46.828 "uuid": "4b06dd01-2461-5a04-8c50-68be83fed59e", 00:19:46.828 "is_configured": true, 00:19:46.828 "data_offset": 0, 00:19:46.828 "data_size": 65536 00:19:46.828 }, 00:19:46.828 { 00:19:46.828 "name": null, 00:19:46.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.828 "is_configured": false, 00:19:46.828 "data_offset": 0, 00:19:46.828 "data_size": 65536 00:19:46.828 }, 00:19:46.828 { 00:19:46.828 "name": "BaseBdev3", 00:19:46.828 "uuid": "02db7b38-0931-5d10-9760-e469f7ebb598", 00:19:46.828 "is_configured": true, 00:19:46.828 "data_offset": 0, 00:19:46.828 "data_size": 65536 00:19:46.828 }, 00:19:46.828 { 00:19:46.828 "name": "BaseBdev4", 00:19:46.828 "uuid": "eca11f4b-bc92-5acd-9578-e384fdce1274", 00:19:46.828 "is_configured": true, 00:19:46.828 "data_offset": 0, 00:19:46.828 "data_size": 65536 00:19:46.828 } 00:19:46.828 ] 00:19:46.828 }' 00:19:46.828 18:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.828 18:56:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.395 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:47.395 [2024-07-24 18:56:32.384232] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:47.395 [2024-07-24 18:56:32.384255] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:47.395 [2024-07-24 18:56:32.384305] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:47.395 [2024-07-24 18:56:32.384358] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:47.395 [2024-07-24 18:56:32.384364] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19a1990 name raid_bdev1, state offline 00:19:47.395 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.395 18:56:32 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:47.653 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:47.912 /dev/nbd0 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:47.912 1+0 records in 00:19:47.912 1+0 records out 00:19:47.912 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022208 s, 18.4 MB/s 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:19:47.912 
18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:47.912 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:48.171 /dev/nbd1 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:48.171 1+0 records in 00:19:48.171 1+0 records out 00:19:48.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198916 s, 20.6 MB/s 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:48.171 18:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.171 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:48.430 18:56:33 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2165151 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2165151 ']' 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2165151 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:48.430 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2165151 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2165151' 00:19:48.689 killing process with pid 2165151 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2165151 00:19:48.689 Received shutdown signal, test time was about 60.000000 seconds 00:19:48.689 00:19:48.689 Latency(us) 00:19:48.689 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:48.689 =================================================================================================================== 00:19:48.689 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:48.689 [2024-07-24 18:56:33.449096] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2165151 00:19:48.689 [2024-07-24 18:56:33.487806] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:19:48.689 00:19:48.689 real 0m19.487s 00:19:48.689 user 0m26.725s 00:19:48.689 sys 0m3.061s 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:48.689 18:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.689 ************************************ 00:19:48.689 END TEST raid_rebuild_test 00:19:48.689 ************************************ 00:19:48.689 18:56:33 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:19:48.689 18:56:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:48.689 18:56:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:48.689 18:56:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:48.947 ************************************ 00:19:48.947 START TEST raid_rebuild_test_sb 00:19:48.947 ************************************ 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2168625 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2168625 /var/tmp/spdk-raid.sock 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2168625 ']' 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:48.947 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:48.947 18:56:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:48.947 [2024-07-24 18:56:33.784700] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:19:48.947 [2024-07-24 18:56:33.784737] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2168625 ] 00:19:48.947 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:48.947 Zero copy mechanism will not be used. 
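For reference, the bdevperf launch recorded in the trace above can be reproduced outside the Jenkins harness roughly as follows. This is a minimal sketch: SPDK_ROOT and the polling loop are stand-ins for the repository path and the waitforlisten helper from test/common/autotest_common.sh; the bdevperf flags are copied verbatim from the traced invocation.
  SPDK_ROOT=/path/to/spdk          # assumed checkout location
  RPC_SOCK=/var/tmp/spdk-raid.sock
  # 60 s randrw workload, 50% reads, 3 MiB I/Os, queue depth 2; -z makes
  # bdevperf wait for RPC configuration, -L enables bdev_raid debug logging.
  "$SPDK_ROOT/build/examples/bdevperf" -r "$RPC_SOCK" -T raid_bdev1 \
      -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # Stand-in for waitforlisten: poll until the RPC socket answers.
  until "$SPDK_ROOT/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done
Once the socket is up, the test drives configuration through rpc.py -s /var/tmp/spdk-raid.sock (bdev_malloc_create, bdev_passthru_create, bdev_raid_create, ...), as the subsequent trace lines show.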
00:19:48.947 [2024-07-24 18:56:33.846771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:48.947 [2024-07-24 18:56:33.924963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:49.205 [2024-07-24 18:56:33.983859] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:49.205 [2024-07-24 18:56:33.983883] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:49.771 18:56:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:49.771 18:56:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:49.771 18:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:49.771 18:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:49.771 BaseBdev1_malloc 00:19:49.771 18:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:50.030 [2024-07-24 18:56:34.899991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:50.030 [2024-07-24 18:56:34.900024] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.030 [2024-07-24 18:56:34.900038] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da9130 00:19:50.030 [2024-07-24 18:56:34.900059] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.030 [2024-07-24 18:56:34.901189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.030 [2024-07-24 18:56:34.901209] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:50.030 BaseBdev1 00:19:50.030 18:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:50.030 18:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:50.289 BaseBdev2_malloc 00:19:50.289 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:50.289 [2024-07-24 18:56:35.236521] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:50.289 [2024-07-24 18:56:35.236556] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.289 [2024-07-24 18:56:35.236567] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4efa0 00:19:50.289 [2024-07-24 18:56:35.236574] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.289 [2024-07-24 18:56:35.237627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.289 [2024-07-24 18:56:35.237648] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:50.289 BaseBdev2 00:19:50.289 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:50.289 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:50.547 BaseBdev3_malloc 00:19:50.547 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:50.805 [2024-07-24 18:56:35.568801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:50.805 [2024-07-24 18:56:35.568834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.805 [2024-07-24 18:56:35.568845] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5a970 00:19:50.805 [2024-07-24 18:56:35.568866] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.805 [2024-07-24 18:56:35.569936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.805 [2024-07-24 18:56:35.569957] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:50.805 BaseBdev3 00:19:50.805 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:50.805 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:50.805 BaseBdev4_malloc 00:19:50.806 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:19:51.063 [2024-07-24 18:56:35.892963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:51.063 [2024-07-24 18:56:35.892995] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.063 [2024-07-24 18:56:35.893026] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f518c0 00:19:51.063 [2024-07-24 18:56:35.893032] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.063 [2024-07-24 18:56:35.894071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.063 [2024-07-24 18:56:35.894091] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:51.063 BaseBdev4 00:19:51.063 18:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:51.063 spare_malloc 00:19:51.321 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:51.321 spare_delay 00:19:51.321 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:51.580 [2024-07-24 18:56:36.389747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:51.580 [2024-07-24 18:56:36.389779] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.580 [2024-07-24 18:56:36.389790] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da1bf0 00:19:51.580 [2024-07-24 18:56:36.389796] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.580 [2024-07-24 18:56:36.390869] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.580 [2024-07-24 18:56:36.390888] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:51.580 spare 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:51.580 [2024-07-24 18:56:36.542170] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:51.580 [2024-07-24 18:56:36.543013] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:51.580 [2024-07-24 18:56:36.543050] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:51.580 [2024-07-24 18:56:36.543077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:51.580 [2024-07-24 18:56:36.543203] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1da3990 00:19:51.580 [2024-07-24 18:56:36.543210] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:51.580 [2024-07-24 18:56:36.543341] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f4e3b0 00:19:51.580 [2024-07-24 18:56:36.543439] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1da3990 00:19:51.580 [2024-07-24 18:56:36.543444] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1da3990 00:19:51.580 [2024-07-24 18:56:36.543531] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.580 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.839 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.839 "name": "raid_bdev1", 00:19:51.839 "uuid": 
"86b05c6e-f133-4274-bf96-94628104954a", 00:19:51.839 "strip_size_kb": 0, 00:19:51.839 "state": "online", 00:19:51.839 "raid_level": "raid1", 00:19:51.839 "superblock": true, 00:19:51.839 "num_base_bdevs": 4, 00:19:51.839 "num_base_bdevs_discovered": 4, 00:19:51.839 "num_base_bdevs_operational": 4, 00:19:51.839 "base_bdevs_list": [ 00:19:51.839 { 00:19:51.839 "name": "BaseBdev1", 00:19:51.839 "uuid": "9022ab05-f96b-5607-b83d-df3dc762dc5d", 00:19:51.839 "is_configured": true, 00:19:51.839 "data_offset": 2048, 00:19:51.839 "data_size": 63488 00:19:51.839 }, 00:19:51.839 { 00:19:51.839 "name": "BaseBdev2", 00:19:51.839 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:19:51.839 "is_configured": true, 00:19:51.839 "data_offset": 2048, 00:19:51.839 "data_size": 63488 00:19:51.839 }, 00:19:51.839 { 00:19:51.839 "name": "BaseBdev3", 00:19:51.839 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:19:51.839 "is_configured": true, 00:19:51.839 "data_offset": 2048, 00:19:51.839 "data_size": 63488 00:19:51.839 }, 00:19:51.839 { 00:19:51.839 "name": "BaseBdev4", 00:19:51.839 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:19:51.839 "is_configured": true, 00:19:51.839 "data_offset": 2048, 00:19:51.839 "data_size": 63488 00:19:51.839 } 00:19:51.839 ] 00:19:51.839 }' 00:19:51.839 18:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.839 18:56:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.406 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:52.406 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:52.406 [2024-07-24 18:56:37.376623] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.406 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:52.406 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.406 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:52.663 18:56:37 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:52.663 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:52.921 [2024-07-24 18:56:37.709301] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f4e770 00:19:52.921 /dev/nbd0 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:52.921 1+0 records in 00:19:52.921 1+0 records out 00:19:52.921 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000191151 s, 21.4 MB/s 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:52.921 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:52.922 18:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:19:58.186 63488+0 records in 00:19:58.186 63488+0 records out 00:19:58.187 32505856 bytes (33 MB, 31 MiB) copied, 4.49293 s, 7.2 MB/s 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:58.187 [2024-07-24 18:56:42.446964] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:58.187 [2024-07-24 18:56:42.675581] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.187 "name": "raid_bdev1", 00:19:58.187 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:19:58.187 "strip_size_kb": 0, 00:19:58.187 "state": "online", 00:19:58.187 "raid_level": "raid1", 00:19:58.187 "superblock": true, 00:19:58.187 
"num_base_bdevs": 4, 00:19:58.187 "num_base_bdevs_discovered": 3, 00:19:58.187 "num_base_bdevs_operational": 3, 00:19:58.187 "base_bdevs_list": [ 00:19:58.187 { 00:19:58.187 "name": null, 00:19:58.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.187 "is_configured": false, 00:19:58.187 "data_offset": 2048, 00:19:58.187 "data_size": 63488 00:19:58.187 }, 00:19:58.187 { 00:19:58.187 "name": "BaseBdev2", 00:19:58.187 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:19:58.187 "is_configured": true, 00:19:58.187 "data_offset": 2048, 00:19:58.187 "data_size": 63488 00:19:58.187 }, 00:19:58.187 { 00:19:58.187 "name": "BaseBdev3", 00:19:58.187 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:19:58.187 "is_configured": true, 00:19:58.187 "data_offset": 2048, 00:19:58.187 "data_size": 63488 00:19:58.187 }, 00:19:58.187 { 00:19:58.187 "name": "BaseBdev4", 00:19:58.187 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:19:58.187 "is_configured": true, 00:19:58.187 "data_offset": 2048, 00:19:58.187 "data_size": 63488 00:19:58.187 } 00:19:58.187 ] 00:19:58.187 }' 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.187 18:56:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:58.445 18:56:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:58.704 [2024-07-24 18:56:43.517770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:58.704 [2024-07-24 18:56:43.521319] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da8a80 00:19:58.704 [2024-07-24 18:56:43.522872] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:58.704 18:56:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.639 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.897 "name": "raid_bdev1", 00:19:59.897 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:19:59.897 "strip_size_kb": 0, 00:19:59.897 "state": "online", 00:19:59.897 "raid_level": "raid1", 00:19:59.897 "superblock": true, 00:19:59.897 "num_base_bdevs": 4, 00:19:59.897 "num_base_bdevs_discovered": 4, 00:19:59.897 "num_base_bdevs_operational": 4, 00:19:59.897 "process": { 00:19:59.897 "type": "rebuild", 00:19:59.897 "target": "spare", 00:19:59.897 "progress": { 00:19:59.897 "blocks": 22528, 00:19:59.897 "percent": 35 00:19:59.897 } 00:19:59.897 }, 00:19:59.897 "base_bdevs_list": [ 
00:19:59.897 { 00:19:59.897 "name": "spare", 00:19:59.897 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:19:59.897 "is_configured": true, 00:19:59.897 "data_offset": 2048, 00:19:59.897 "data_size": 63488 00:19:59.897 }, 00:19:59.897 { 00:19:59.897 "name": "BaseBdev2", 00:19:59.897 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:19:59.897 "is_configured": true, 00:19:59.897 "data_offset": 2048, 00:19:59.897 "data_size": 63488 00:19:59.897 }, 00:19:59.897 { 00:19:59.897 "name": "BaseBdev3", 00:19:59.897 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:19:59.897 "is_configured": true, 00:19:59.897 "data_offset": 2048, 00:19:59.897 "data_size": 63488 00:19:59.897 }, 00:19:59.897 { 00:19:59.897 "name": "BaseBdev4", 00:19:59.897 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:19:59.897 "is_configured": true, 00:19:59.897 "data_offset": 2048, 00:19:59.897 "data_size": 63488 00:19:59.897 } 00:19:59.897 ] 00:19:59.897 }' 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:59.897 18:56:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:00.156 [2024-07-24 18:56:44.943091] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:00.156 [2024-07-24 18:56:45.033377] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:00.156 [2024-07-24 18:56:45.033406] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:00.156 [2024-07-24 18:56:45.033415] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:00.156 [2024-07-24 18:56:45.033435] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.156 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.156 18:56:45 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.415 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.415 "name": "raid_bdev1", 00:20:00.415 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:00.415 "strip_size_kb": 0, 00:20:00.415 "state": "online", 00:20:00.415 "raid_level": "raid1", 00:20:00.415 "superblock": true, 00:20:00.415 "num_base_bdevs": 4, 00:20:00.415 "num_base_bdevs_discovered": 3, 00:20:00.415 "num_base_bdevs_operational": 3, 00:20:00.415 "base_bdevs_list": [ 00:20:00.415 { 00:20:00.415 "name": null, 00:20:00.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.415 "is_configured": false, 00:20:00.415 "data_offset": 2048, 00:20:00.415 "data_size": 63488 00:20:00.415 }, 00:20:00.415 { 00:20:00.415 "name": "BaseBdev2", 00:20:00.415 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:20:00.415 "is_configured": true, 00:20:00.415 "data_offset": 2048, 00:20:00.415 "data_size": 63488 00:20:00.415 }, 00:20:00.415 { 00:20:00.415 "name": "BaseBdev3", 00:20:00.415 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:00.415 "is_configured": true, 00:20:00.415 "data_offset": 2048, 00:20:00.415 "data_size": 63488 00:20:00.415 }, 00:20:00.415 { 00:20:00.415 "name": "BaseBdev4", 00:20:00.415 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:00.415 "is_configured": true, 00:20:00.415 "data_offset": 2048, 00:20:00.415 "data_size": 63488 00:20:00.415 } 00:20:00.415 ] 00:20:00.415 }' 00:20:00.415 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.415 18:56:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:00.981 "name": "raid_bdev1", 00:20:00.981 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:00.981 "strip_size_kb": 0, 00:20:00.981 "state": "online", 00:20:00.981 "raid_level": "raid1", 00:20:00.981 "superblock": true, 00:20:00.981 "num_base_bdevs": 4, 00:20:00.981 "num_base_bdevs_discovered": 3, 00:20:00.981 "num_base_bdevs_operational": 3, 00:20:00.981 "base_bdevs_list": [ 00:20:00.981 { 00:20:00.981 "name": null, 00:20:00.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.981 "is_configured": false, 00:20:00.981 "data_offset": 2048, 00:20:00.981 "data_size": 63488 00:20:00.981 }, 00:20:00.981 { 00:20:00.981 "name": "BaseBdev2", 00:20:00.981 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:20:00.981 "is_configured": true, 00:20:00.981 "data_offset": 2048, 00:20:00.981 "data_size": 63488 00:20:00.981 }, 
00:20:00.981 { 00:20:00.981 "name": "BaseBdev3", 00:20:00.981 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:00.981 "is_configured": true, 00:20:00.981 "data_offset": 2048, 00:20:00.981 "data_size": 63488 00:20:00.981 }, 00:20:00.981 { 00:20:00.981 "name": "BaseBdev4", 00:20:00.981 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:00.981 "is_configured": true, 00:20:00.981 "data_offset": 2048, 00:20:00.981 "data_size": 63488 00:20:00.981 } 00:20:00.981 ] 00:20:00.981 }' 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:00.981 18:56:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:01.240 [2024-07-24 18:56:46.139816] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:01.240 [2024-07-24 18:56:46.143283] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da8a80 00:20:01.240 [2024-07-24 18:56:46.144298] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:01.240 18:56:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.207 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.465 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:02.465 "name": "raid_bdev1", 00:20:02.465 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:02.465 "strip_size_kb": 0, 00:20:02.465 "state": "online", 00:20:02.465 "raid_level": "raid1", 00:20:02.465 "superblock": true, 00:20:02.465 "num_base_bdevs": 4, 00:20:02.466 "num_base_bdevs_discovered": 4, 00:20:02.466 "num_base_bdevs_operational": 4, 00:20:02.466 "process": { 00:20:02.466 "type": "rebuild", 00:20:02.466 "target": "spare", 00:20:02.466 "progress": { 00:20:02.466 "blocks": 22528, 00:20:02.466 "percent": 35 00:20:02.466 } 00:20:02.466 }, 00:20:02.466 "base_bdevs_list": [ 00:20:02.466 { 00:20:02.466 "name": "spare", 00:20:02.466 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:02.466 "is_configured": true, 00:20:02.466 "data_offset": 2048, 00:20:02.466 "data_size": 63488 00:20:02.466 }, 00:20:02.466 { 00:20:02.466 "name": "BaseBdev2", 00:20:02.466 "uuid": "036d60ed-4f8a-56e1-9a03-f7462d780adb", 00:20:02.466 
"is_configured": true, 00:20:02.466 "data_offset": 2048, 00:20:02.466 "data_size": 63488 00:20:02.466 }, 00:20:02.466 { 00:20:02.466 "name": "BaseBdev3", 00:20:02.466 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:02.466 "is_configured": true, 00:20:02.466 "data_offset": 2048, 00:20:02.466 "data_size": 63488 00:20:02.466 }, 00:20:02.466 { 00:20:02.466 "name": "BaseBdev4", 00:20:02.466 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:02.466 "is_configured": true, 00:20:02.466 "data_offset": 2048, 00:20:02.466 "data_size": 63488 00:20:02.466 } 00:20:02.466 ] 00:20:02.466 }' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:02.466 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:02.466 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:02.724 [2024-07-24 18:56:47.568550] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:02.982 [2024-07-24 18:56:47.755097] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1da8a80 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:02.982 "name": "raid_bdev1", 00:20:02.982 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:02.982 "strip_size_kb": 0, 00:20:02.982 "state": "online", 00:20:02.982 "raid_level": "raid1", 
00:20:02.982 "superblock": true, 00:20:02.982 "num_base_bdevs": 4, 00:20:02.982 "num_base_bdevs_discovered": 3, 00:20:02.982 "num_base_bdevs_operational": 3, 00:20:02.982 "process": { 00:20:02.982 "type": "rebuild", 00:20:02.982 "target": "spare", 00:20:02.982 "progress": { 00:20:02.982 "blocks": 32768, 00:20:02.982 "percent": 51 00:20:02.982 } 00:20:02.982 }, 00:20:02.982 "base_bdevs_list": [ 00:20:02.982 { 00:20:02.982 "name": "spare", 00:20:02.982 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:02.982 "is_configured": true, 00:20:02.982 "data_offset": 2048, 00:20:02.982 "data_size": 63488 00:20:02.982 }, 00:20:02.982 { 00:20:02.982 "name": null, 00:20:02.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.982 "is_configured": false, 00:20:02.982 "data_offset": 2048, 00:20:02.982 "data_size": 63488 00:20:02.982 }, 00:20:02.982 { 00:20:02.982 "name": "BaseBdev3", 00:20:02.982 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:02.982 "is_configured": true, 00:20:02.982 "data_offset": 2048, 00:20:02.982 "data_size": 63488 00:20:02.982 }, 00:20:02.982 { 00:20:02.982 "name": "BaseBdev4", 00:20:02.982 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:02.982 "is_configured": true, 00:20:02.982 "data_offset": 2048, 00:20:02.982 "data_size": 63488 00:20:02.982 } 00:20:02.982 ] 00:20:02.982 }' 00:20:02.982 18:56:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=689 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:03.241 "name": "raid_bdev1", 00:20:03.241 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:03.241 "strip_size_kb": 0, 00:20:03.241 "state": "online", 00:20:03.241 "raid_level": "raid1", 00:20:03.241 "superblock": true, 00:20:03.241 "num_base_bdevs": 4, 00:20:03.241 "num_base_bdevs_discovered": 3, 00:20:03.241 "num_base_bdevs_operational": 3, 00:20:03.241 "process": { 00:20:03.241 "type": "rebuild", 00:20:03.241 "target": "spare", 00:20:03.241 "progress": { 00:20:03.241 "blocks": 38912, 00:20:03.241 "percent": 61 00:20:03.241 } 00:20:03.241 }, 00:20:03.241 
"base_bdevs_list": [ 00:20:03.241 { 00:20:03.241 "name": "spare", 00:20:03.241 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:03.241 "is_configured": true, 00:20:03.241 "data_offset": 2048, 00:20:03.241 "data_size": 63488 00:20:03.241 }, 00:20:03.241 { 00:20:03.241 "name": null, 00:20:03.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.241 "is_configured": false, 00:20:03.241 "data_offset": 2048, 00:20:03.241 "data_size": 63488 00:20:03.241 }, 00:20:03.241 { 00:20:03.241 "name": "BaseBdev3", 00:20:03.241 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:03.241 "is_configured": true, 00:20:03.241 "data_offset": 2048, 00:20:03.241 "data_size": 63488 00:20:03.241 }, 00:20:03.241 { 00:20:03.241 "name": "BaseBdev4", 00:20:03.241 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:03.241 "is_configured": true, 00:20:03.241 "data_offset": 2048, 00:20:03.241 "data_size": 63488 00:20:03.241 } 00:20:03.241 ] 00:20:03.241 }' 00:20:03.241 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:03.499 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:03.500 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:03.500 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:03.500 18:56:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.434 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.434 [2024-07-24 18:56:49.366218] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:04.434 [2024-07-24 18:56:49.366262] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:04.434 [2024-07-24 18:56:49.366350] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:04.692 "name": "raid_bdev1", 00:20:04.692 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:04.692 "strip_size_kb": 0, 00:20:04.692 "state": "online", 00:20:04.692 "raid_level": "raid1", 00:20:04.692 "superblock": true, 00:20:04.692 "num_base_bdevs": 4, 00:20:04.692 "num_base_bdevs_discovered": 3, 00:20:04.692 "num_base_bdevs_operational": 3, 00:20:04.692 "base_bdevs_list": [ 00:20:04.692 { 00:20:04.692 "name": "spare", 00:20:04.692 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:04.692 "is_configured": true, 00:20:04.692 "data_offset": 2048, 
00:20:04.692 "data_size": 63488 00:20:04.692 }, 00:20:04.692 { 00:20:04.692 "name": null, 00:20:04.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.692 "is_configured": false, 00:20:04.692 "data_offset": 2048, 00:20:04.692 "data_size": 63488 00:20:04.692 }, 00:20:04.692 { 00:20:04.692 "name": "BaseBdev3", 00:20:04.692 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:04.692 "is_configured": true, 00:20:04.692 "data_offset": 2048, 00:20:04.692 "data_size": 63488 00:20:04.692 }, 00:20:04.692 { 00:20:04.692 "name": "BaseBdev4", 00:20:04.692 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:04.692 "is_configured": true, 00:20:04.692 "data_offset": 2048, 00:20:04.692 "data_size": 63488 00:20:04.692 } 00:20:04.692 ] 00:20:04.692 }' 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.692 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:04.950 "name": "raid_bdev1", 00:20:04.950 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:04.950 "strip_size_kb": 0, 00:20:04.950 "state": "online", 00:20:04.950 "raid_level": "raid1", 00:20:04.950 "superblock": true, 00:20:04.950 "num_base_bdevs": 4, 00:20:04.950 "num_base_bdevs_discovered": 3, 00:20:04.950 "num_base_bdevs_operational": 3, 00:20:04.950 "base_bdevs_list": [ 00:20:04.950 { 00:20:04.950 "name": "spare", 00:20:04.950 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:04.950 "is_configured": true, 00:20:04.950 "data_offset": 2048, 00:20:04.950 "data_size": 63488 00:20:04.950 }, 00:20:04.950 { 00:20:04.950 "name": null, 00:20:04.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.950 "is_configured": false, 00:20:04.950 "data_offset": 2048, 00:20:04.950 "data_size": 63488 00:20:04.950 }, 00:20:04.950 { 00:20:04.950 "name": "BaseBdev3", 00:20:04.950 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:04.950 "is_configured": true, 00:20:04.950 "data_offset": 2048, 00:20:04.950 "data_size": 63488 00:20:04.950 }, 00:20:04.950 { 00:20:04.950 "name": "BaseBdev4", 00:20:04.950 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:04.950 "is_configured": true, 00:20:04.950 "data_offset": 2048, 00:20:04.950 "data_size": 63488 
00:20:04.950 } 00:20:04.950 ] 00:20:04.950 }' 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:04.950 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.951 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.209 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.209 "name": "raid_bdev1", 00:20:05.209 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:05.209 "strip_size_kb": 0, 00:20:05.209 "state": "online", 00:20:05.209 "raid_level": "raid1", 00:20:05.209 "superblock": true, 00:20:05.209 "num_base_bdevs": 4, 00:20:05.209 "num_base_bdevs_discovered": 3, 00:20:05.209 "num_base_bdevs_operational": 3, 00:20:05.209 "base_bdevs_list": [ 00:20:05.209 { 00:20:05.209 "name": "spare", 00:20:05.209 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:05.209 "is_configured": true, 00:20:05.209 "data_offset": 2048, 00:20:05.209 "data_size": 63488 00:20:05.209 }, 00:20:05.209 { 00:20:05.209 "name": null, 00:20:05.209 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.209 "is_configured": false, 00:20:05.209 "data_offset": 2048, 00:20:05.209 "data_size": 63488 00:20:05.209 }, 00:20:05.209 { 00:20:05.209 "name": "BaseBdev3", 00:20:05.209 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:05.209 "is_configured": true, 00:20:05.209 "data_offset": 2048, 00:20:05.209 "data_size": 63488 00:20:05.209 }, 00:20:05.209 { 00:20:05.209 "name": "BaseBdev4", 00:20:05.209 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:05.209 "is_configured": true, 00:20:05.209 "data_offset": 2048, 00:20:05.209 "data_size": 63488 00:20:05.209 } 00:20:05.209 ] 00:20:05.209 }' 00:20:05.209 18:56:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.209 18:56:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.466 18:56:50 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:05.724 [2024-07-24 18:56:50.621252] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:05.724 [2024-07-24 18:56:50.621273] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:05.724 [2024-07-24 18:56:50.621318] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:05.724 [2024-07-24 18:56:50.621367] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:05.724 [2024-07-24 18:56:50.621372] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1da3990 name raid_bdev1, state offline 00:20:05.724 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.724 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:05.982 /dev/nbd0 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:05.982 18:56:50 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:05.982 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:05.982 1+0 records in 00:20:05.982 1+0 records out 00:20:05.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 9.2645e-05 s, 44.2 MB/s 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:06.240 18:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:06.241 /dev/nbd1 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:06.241 1+0 records in 00:20:06.241 1+0 records out 00:20:06.241 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198126 s, 20.7 MB/s 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:06.241 18:56:51 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:06.241 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:06.498 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:06.499 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:06.756 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:07.014 [2024-07-24 18:56:51.976778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:07.014 [2024-07-24 18:56:51.976814] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.014 [2024-07-24 18:56:51.976828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4dba0 00:20:07.014 [2024-07-24 18:56:51.976835] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.014 [2024-07-24 18:56:51.978014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.014 [2024-07-24 18:56:51.978036] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:07.014 [2024-07-24 18:56:51.978092] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:07.014 [2024-07-24 18:56:51.978111] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:07.014 [2024-07-24 18:56:51.978185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:07.014 [2024-07-24 18:56:51.978235] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:07.014 spare 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.014 18:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.014 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.014 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.272 [2024-07-24 18:56:52.078527] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1da7210 00:20:07.272 [2024-07-24 18:56:52.078538] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:07.272 [2024-07-24 18:56:52.078674] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da0c30 00:20:07.272 [2024-07-24 18:56:52.078776] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1da7210 00:20:07.272 [2024-07-24 18:56:52.078781] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1da7210 00:20:07.272 [2024-07-24 18:56:52.078850] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:20:07.272 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.272 "name": "raid_bdev1", 00:20:07.272 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:07.272 "strip_size_kb": 0, 00:20:07.273 "state": "online", 00:20:07.273 "raid_level": "raid1", 00:20:07.273 "superblock": true, 00:20:07.273 "num_base_bdevs": 4, 00:20:07.273 "num_base_bdevs_discovered": 3, 00:20:07.273 "num_base_bdevs_operational": 3, 00:20:07.273 "base_bdevs_list": [ 00:20:07.273 { 00:20:07.273 "name": "spare", 00:20:07.273 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:07.273 "is_configured": true, 00:20:07.273 "data_offset": 2048, 00:20:07.273 "data_size": 63488 00:20:07.273 }, 00:20:07.273 { 00:20:07.273 "name": null, 00:20:07.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.273 "is_configured": false, 00:20:07.273 "data_offset": 2048, 00:20:07.273 "data_size": 63488 00:20:07.273 }, 00:20:07.273 { 00:20:07.273 "name": "BaseBdev3", 00:20:07.273 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:07.273 "is_configured": true, 00:20:07.273 "data_offset": 2048, 00:20:07.273 "data_size": 63488 00:20:07.273 }, 00:20:07.273 { 00:20:07.273 "name": "BaseBdev4", 00:20:07.273 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:07.273 "is_configured": true, 00:20:07.273 "data_offset": 2048, 00:20:07.273 "data_size": 63488 00:20:07.273 } 00:20:07.273 ] 00:20:07.273 }' 00:20:07.273 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.273 18:56:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.837 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:08.095 "name": "raid_bdev1", 00:20:08.095 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:08.095 "strip_size_kb": 0, 00:20:08.095 "state": "online", 00:20:08.095 "raid_level": "raid1", 00:20:08.095 "superblock": true, 00:20:08.095 "num_base_bdevs": 4, 00:20:08.095 "num_base_bdevs_discovered": 3, 00:20:08.095 "num_base_bdevs_operational": 3, 00:20:08.095 "base_bdevs_list": [ 00:20:08.095 { 00:20:08.095 "name": "spare", 00:20:08.095 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:08.095 "is_configured": true, 00:20:08.095 "data_offset": 2048, 00:20:08.095 "data_size": 63488 00:20:08.095 }, 00:20:08.095 { 00:20:08.095 "name": null, 00:20:08.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.095 "is_configured": false, 00:20:08.095 "data_offset": 2048, 00:20:08.095 "data_size": 63488 00:20:08.095 }, 00:20:08.095 { 00:20:08.095 "name": "BaseBdev3", 00:20:08.095 "uuid": 
"3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:08.095 "is_configured": true, 00:20:08.095 "data_offset": 2048, 00:20:08.095 "data_size": 63488 00:20:08.095 }, 00:20:08.095 { 00:20:08.095 "name": "BaseBdev4", 00:20:08.095 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:08.095 "is_configured": true, 00:20:08.095 "data_offset": 2048, 00:20:08.095 "data_size": 63488 00:20:08.095 } 00:20:08.095 ] 00:20:08.095 }' 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.095 18:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:08.095 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:08.095 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:08.353 [2024-07-24 18:56:53.248110] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.353 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.611 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.611 "name": "raid_bdev1", 00:20:08.611 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:08.611 "strip_size_kb": 0, 00:20:08.611 "state": "online", 00:20:08.611 "raid_level": "raid1", 00:20:08.611 "superblock": true, 00:20:08.611 "num_base_bdevs": 4, 00:20:08.611 "num_base_bdevs_discovered": 2, 00:20:08.611 "num_base_bdevs_operational": 2, 00:20:08.611 "base_bdevs_list": [ 00:20:08.611 { 00:20:08.611 "name": null, 
00:20:08.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.611 "is_configured": false, 00:20:08.611 "data_offset": 2048, 00:20:08.611 "data_size": 63488 00:20:08.611 }, 00:20:08.611 { 00:20:08.611 "name": null, 00:20:08.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.611 "is_configured": false, 00:20:08.611 "data_offset": 2048, 00:20:08.611 "data_size": 63488 00:20:08.611 }, 00:20:08.611 { 00:20:08.611 "name": "BaseBdev3", 00:20:08.611 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:08.611 "is_configured": true, 00:20:08.611 "data_offset": 2048, 00:20:08.612 "data_size": 63488 00:20:08.612 }, 00:20:08.612 { 00:20:08.612 "name": "BaseBdev4", 00:20:08.612 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:08.612 "is_configured": true, 00:20:08.612 "data_offset": 2048, 00:20:08.612 "data_size": 63488 00:20:08.612 } 00:20:08.612 ] 00:20:08.612 }' 00:20:08.612 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.612 18:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:09.176 18:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:09.176 [2024-07-24 18:56:54.086304] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:09.176 [2024-07-24 18:56:54.086405] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:09.176 [2024-07-24 18:56:54.086414] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:09.176 [2024-07-24 18:56:54.086432] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:09.176 [2024-07-24 18:56:54.089835] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da8aa0 00:20:09.176 [2024-07-24 18:56:54.090778] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:09.176 18:56:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.109 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.367 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:10.367 "name": "raid_bdev1", 00:20:10.367 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:10.367 "strip_size_kb": 0, 00:20:10.367 "state": "online", 00:20:10.367 "raid_level": "raid1", 00:20:10.367 "superblock": true, 00:20:10.367 "num_base_bdevs": 4, 00:20:10.368 "num_base_bdevs_discovered": 3, 00:20:10.368 "num_base_bdevs_operational": 3, 
00:20:10.368 "process": { 00:20:10.368 "type": "rebuild", 00:20:10.368 "target": "spare", 00:20:10.368 "progress": { 00:20:10.368 "blocks": 22528, 00:20:10.368 "percent": 35 00:20:10.368 } 00:20:10.368 }, 00:20:10.368 "base_bdevs_list": [ 00:20:10.368 { 00:20:10.368 "name": "spare", 00:20:10.368 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:10.368 "is_configured": true, 00:20:10.368 "data_offset": 2048, 00:20:10.368 "data_size": 63488 00:20:10.368 }, 00:20:10.368 { 00:20:10.368 "name": null, 00:20:10.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:10.368 "is_configured": false, 00:20:10.368 "data_offset": 2048, 00:20:10.368 "data_size": 63488 00:20:10.368 }, 00:20:10.368 { 00:20:10.368 "name": "BaseBdev3", 00:20:10.368 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:10.368 "is_configured": true, 00:20:10.368 "data_offset": 2048, 00:20:10.368 "data_size": 63488 00:20:10.368 }, 00:20:10.368 { 00:20:10.368 "name": "BaseBdev4", 00:20:10.368 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:10.368 "is_configured": true, 00:20:10.368 "data_offset": 2048, 00:20:10.368 "data_size": 63488 00:20:10.368 } 00:20:10.368 ] 00:20:10.368 }' 00:20:10.368 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:10.368 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:10.368 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:10.368 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:10.368 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:10.626 [2024-07-24 18:56:55.515034] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:10.626 [2024-07-24 18:56:55.601250] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:10.626 [2024-07-24 18:56:55.601281] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.626 [2024-07-24 18:56:55.601290] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:10.626 [2024-07-24 18:56:55.601294] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.626 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.883 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.883 "name": "raid_bdev1", 00:20:10.883 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:10.883 "strip_size_kb": 0, 00:20:10.883 "state": "online", 00:20:10.883 "raid_level": "raid1", 00:20:10.883 "superblock": true, 00:20:10.883 "num_base_bdevs": 4, 00:20:10.883 "num_base_bdevs_discovered": 2, 00:20:10.883 "num_base_bdevs_operational": 2, 00:20:10.883 "base_bdevs_list": [ 00:20:10.883 { 00:20:10.883 "name": null, 00:20:10.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:10.883 "is_configured": false, 00:20:10.883 "data_offset": 2048, 00:20:10.883 "data_size": 63488 00:20:10.883 }, 00:20:10.883 { 00:20:10.883 "name": null, 00:20:10.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:10.883 "is_configured": false, 00:20:10.883 "data_offset": 2048, 00:20:10.883 "data_size": 63488 00:20:10.883 }, 00:20:10.883 { 00:20:10.883 "name": "BaseBdev3", 00:20:10.883 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:10.883 "is_configured": true, 00:20:10.883 "data_offset": 2048, 00:20:10.883 "data_size": 63488 00:20:10.883 }, 00:20:10.883 { 00:20:10.883 "name": "BaseBdev4", 00:20:10.883 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:10.883 "is_configured": true, 00:20:10.883 "data_offset": 2048, 00:20:10.883 "data_size": 63488 00:20:10.883 } 00:20:10.883 ] 00:20:10.883 }' 00:20:10.883 18:56:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.883 18:56:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.450 18:56:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:11.450 [2024-07-24 18:56:56.378806] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:11.450 [2024-07-24 18:56:56.378843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:11.450 [2024-07-24 18:56:56.378873] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e3fc10 00:20:11.450 [2024-07-24 18:56:56.378880] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:11.450 [2024-07-24 18:56:56.379162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:11.450 [2024-07-24 18:56:56.379173] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:11.450 [2024-07-24 18:56:56.379245] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:11.450 [2024-07-24 18:56:56.379253] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:11.450 [2024-07-24 18:56:56.379258] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:11.450 [2024-07-24 18:56:56.379269] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:11.450 [2024-07-24 18:56:56.382720] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f514f0 00:20:11.450 [2024-07-24 18:56:56.383714] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:11.450 spare 00:20:11.450 18:56:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.826 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:12.826 "name": "raid_bdev1", 00:20:12.826 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:12.826 "strip_size_kb": 0, 00:20:12.826 "state": "online", 00:20:12.826 "raid_level": "raid1", 00:20:12.826 "superblock": true, 00:20:12.826 "num_base_bdevs": 4, 00:20:12.826 "num_base_bdevs_discovered": 3, 00:20:12.826 "num_base_bdevs_operational": 3, 00:20:12.826 "process": { 00:20:12.826 "type": "rebuild", 00:20:12.826 "target": "spare", 00:20:12.826 "progress": { 00:20:12.826 "blocks": 22528, 00:20:12.826 "percent": 35 00:20:12.826 } 00:20:12.826 }, 00:20:12.827 "base_bdevs_list": [ 00:20:12.827 { 00:20:12.827 "name": "spare", 00:20:12.827 "uuid": "851967cb-e92a-542c-9b43-adb64efec82c", 00:20:12.827 "is_configured": true, 00:20:12.827 "data_offset": 2048, 00:20:12.827 "data_size": 63488 00:20:12.827 }, 00:20:12.827 { 00:20:12.827 "name": null, 00:20:12.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.827 "is_configured": false, 00:20:12.827 "data_offset": 2048, 00:20:12.827 "data_size": 63488 00:20:12.827 }, 00:20:12.827 { 00:20:12.827 "name": "BaseBdev3", 00:20:12.827 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:12.827 "is_configured": true, 00:20:12.827 "data_offset": 2048, 00:20:12.827 "data_size": 63488 00:20:12.827 }, 00:20:12.827 { 00:20:12.827 "name": "BaseBdev4", 00:20:12.827 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:12.827 "is_configured": true, 00:20:12.827 "data_offset": 2048, 00:20:12.827 "data_size": 63488 00:20:12.827 } 00:20:12.827 ] 00:20:12.827 }' 00:20:12.827 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:12.827 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:12.827 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:12.827 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:12.827 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:12.827 [2024-07-24 18:56:57.807972] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:13.086 [2024-07-24 18:56:57.894235] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:13.086 [2024-07-24 18:56:57.894265] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.086 [2024-07-24 18:56:57.894274] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:13.086 [2024-07-24 18:56:57.894277] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.086 18:56:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.086 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.086 "name": "raid_bdev1", 00:20:13.086 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:13.086 "strip_size_kb": 0, 00:20:13.086 "state": "online", 00:20:13.086 "raid_level": "raid1", 00:20:13.086 "superblock": true, 00:20:13.086 "num_base_bdevs": 4, 00:20:13.086 "num_base_bdevs_discovered": 2, 00:20:13.086 "num_base_bdevs_operational": 2, 00:20:13.086 "base_bdevs_list": [ 00:20:13.086 { 00:20:13.086 "name": null, 00:20:13.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.086 "is_configured": false, 00:20:13.086 "data_offset": 2048, 00:20:13.086 "data_size": 63488 00:20:13.086 }, 00:20:13.086 { 00:20:13.086 "name": null, 00:20:13.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.086 "is_configured": false, 00:20:13.086 "data_offset": 2048, 00:20:13.086 "data_size": 63488 00:20:13.086 }, 00:20:13.086 { 00:20:13.086 "name": "BaseBdev3", 00:20:13.086 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:13.086 "is_configured": true, 00:20:13.086 "data_offset": 2048, 00:20:13.086 "data_size": 63488 00:20:13.086 }, 00:20:13.086 { 00:20:13.086 "name": "BaseBdev4", 00:20:13.086 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:13.086 "is_configured": true, 00:20:13.086 "data_offset": 2048, 00:20:13.086 "data_size": 63488 
00:20:13.086 } 00:20:13.086 ] 00:20:13.086 }' 00:20:13.086 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.086 18:56:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.652 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:13.910 "name": "raid_bdev1", 00:20:13.910 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:13.910 "strip_size_kb": 0, 00:20:13.910 "state": "online", 00:20:13.910 "raid_level": "raid1", 00:20:13.910 "superblock": true, 00:20:13.910 "num_base_bdevs": 4, 00:20:13.910 "num_base_bdevs_discovered": 2, 00:20:13.910 "num_base_bdevs_operational": 2, 00:20:13.910 "base_bdevs_list": [ 00:20:13.910 { 00:20:13.910 "name": null, 00:20:13.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.910 "is_configured": false, 00:20:13.910 "data_offset": 2048, 00:20:13.910 "data_size": 63488 00:20:13.910 }, 00:20:13.910 { 00:20:13.910 "name": null, 00:20:13.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.910 "is_configured": false, 00:20:13.910 "data_offset": 2048, 00:20:13.910 "data_size": 63488 00:20:13.910 }, 00:20:13.910 { 00:20:13.910 "name": "BaseBdev3", 00:20:13.910 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:13.910 "is_configured": true, 00:20:13.910 "data_offset": 2048, 00:20:13.910 "data_size": 63488 00:20:13.910 }, 00:20:13.910 { 00:20:13.910 "name": "BaseBdev4", 00:20:13.910 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:13.910 "is_configured": true, 00:20:13.910 "data_offset": 2048, 00:20:13.910 "data_size": 63488 00:20:13.910 } 00:20:13.910 ] 00:20:13.910 }' 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:13.910 18:56:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:14.167 18:56:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:14.167 [2024-07-24 18:56:59.165160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:14.167 
[2024-07-24 18:56:59.165199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:14.167 [2024-07-24 18:56:59.165211] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da9360 00:20:14.167 [2024-07-24 18:56:59.165217] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:14.167 [2024-07-24 18:56:59.165466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:14.167 [2024-07-24 18:56:59.165484] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:14.167 [2024-07-24 18:56:59.165531] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:14.167 [2024-07-24 18:56:59.165538] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:14.167 [2024-07-24 18:56:59.165542] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:14.167 BaseBdev1 00:20:14.424 18:56:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.359 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.359 "name": "raid_bdev1", 00:20:15.359 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:15.359 "strip_size_kb": 0, 00:20:15.359 "state": "online", 00:20:15.359 "raid_level": "raid1", 00:20:15.359 "superblock": true, 00:20:15.359 "num_base_bdevs": 4, 00:20:15.359 "num_base_bdevs_discovered": 2, 00:20:15.359 "num_base_bdevs_operational": 2, 00:20:15.359 "base_bdevs_list": [ 00:20:15.359 { 00:20:15.359 "name": null, 00:20:15.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.359 "is_configured": false, 00:20:15.359 "data_offset": 2048, 00:20:15.359 "data_size": 63488 00:20:15.359 }, 00:20:15.359 { 00:20:15.359 "name": null, 00:20:15.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.359 "is_configured": false, 00:20:15.359 "data_offset": 2048, 00:20:15.359 "data_size": 63488 00:20:15.359 }, 00:20:15.359 { 
00:20:15.359 "name": "BaseBdev3", 00:20:15.360 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:15.360 "is_configured": true, 00:20:15.360 "data_offset": 2048, 00:20:15.360 "data_size": 63488 00:20:15.360 }, 00:20:15.360 { 00:20:15.360 "name": "BaseBdev4", 00:20:15.360 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:15.360 "is_configured": true, 00:20:15.360 "data_offset": 2048, 00:20:15.360 "data_size": 63488 00:20:15.360 } 00:20:15.360 ] 00:20:15.360 }' 00:20:15.360 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.360 18:57:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.926 18:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:16.185 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:16.185 "name": "raid_bdev1", 00:20:16.185 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:16.185 "strip_size_kb": 0, 00:20:16.185 "state": "online", 00:20:16.185 "raid_level": "raid1", 00:20:16.185 "superblock": true, 00:20:16.185 "num_base_bdevs": 4, 00:20:16.185 "num_base_bdevs_discovered": 2, 00:20:16.185 "num_base_bdevs_operational": 2, 00:20:16.185 "base_bdevs_list": [ 00:20:16.185 { 00:20:16.185 "name": null, 00:20:16.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.185 "is_configured": false, 00:20:16.185 "data_offset": 2048, 00:20:16.185 "data_size": 63488 00:20:16.185 }, 00:20:16.185 { 00:20:16.186 "name": null, 00:20:16.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.186 "is_configured": false, 00:20:16.186 "data_offset": 2048, 00:20:16.186 "data_size": 63488 00:20:16.186 }, 00:20:16.186 { 00:20:16.186 "name": "BaseBdev3", 00:20:16.186 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:16.186 "is_configured": true, 00:20:16.186 "data_offset": 2048, 00:20:16.186 "data_size": 63488 00:20:16.186 }, 00:20:16.186 { 00:20:16.186 "name": "BaseBdev4", 00:20:16.186 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:16.186 "is_configured": true, 00:20:16.186 "data_offset": 2048, 00:20:16.186 "data_size": 63488 00:20:16.186 } 00:20:16.186 ] 00:20:16.186 }' 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:16.186 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:16.444 [2024-07-24 18:57:01.266690] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:16.444 [2024-07-24 18:57:01.266785] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:16.444 [2024-07-24 18:57:01.266795] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:16.444 request: 00:20:16.444 { 00:20:16.444 "base_bdev": "BaseBdev1", 00:20:16.444 "raid_bdev": "raid_bdev1", 00:20:16.444 "method": "bdev_raid_add_base_bdev", 00:20:16.444 "req_id": 1 00:20:16.444 } 00:20:16.444 Got JSON-RPC error response 00:20:16.444 response: 00:20:16.444 { 00:20:16.444 "code": -22, 00:20:16.444 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:16.444 } 00:20:16.444 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:16.444 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:16.444 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:16.444 18:57:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:16.444 18:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.379 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.638 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.638 "name": "raid_bdev1", 00:20:17.638 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:17.638 "strip_size_kb": 0, 00:20:17.638 "state": "online", 00:20:17.638 "raid_level": "raid1", 00:20:17.638 "superblock": true, 00:20:17.638 "num_base_bdevs": 4, 00:20:17.638 "num_base_bdevs_discovered": 2, 00:20:17.638 "num_base_bdevs_operational": 2, 00:20:17.638 "base_bdevs_list": [ 00:20:17.638 { 00:20:17.638 "name": null, 00:20:17.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.638 "is_configured": false, 00:20:17.638 "data_offset": 2048, 00:20:17.638 "data_size": 63488 00:20:17.638 }, 00:20:17.638 { 00:20:17.638 "name": null, 00:20:17.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.638 "is_configured": false, 00:20:17.638 "data_offset": 2048, 00:20:17.638 "data_size": 63488 00:20:17.638 }, 00:20:17.638 { 00:20:17.638 "name": "BaseBdev3", 00:20:17.638 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:17.638 "is_configured": true, 00:20:17.638 "data_offset": 2048, 00:20:17.638 "data_size": 63488 00:20:17.638 }, 00:20:17.638 { 00:20:17.638 "name": "BaseBdev4", 00:20:17.638 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:17.638 "is_configured": true, 00:20:17.638 "data_offset": 2048, 00:20:17.638 "data_size": 63488 00:20:17.638 } 00:20:17.638 ] 00:20:17.638 }' 00:20:17.639 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.639 18:57:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.205 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:18.205 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:18.205 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:18.205 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:18.206 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:18.206 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.206 18:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.206 "name": "raid_bdev1", 00:20:18.206 "uuid": "86b05c6e-f133-4274-bf96-94628104954a", 00:20:18.206 "strip_size_kb": 0, 00:20:18.206 "state": "online", 00:20:18.206 "raid_level": "raid1", 00:20:18.206 "superblock": true, 00:20:18.206 "num_base_bdevs": 4, 00:20:18.206 "num_base_bdevs_discovered": 2, 00:20:18.206 "num_base_bdevs_operational": 2, 00:20:18.206 "base_bdevs_list": [ 00:20:18.206 { 00:20:18.206 "name": null, 00:20:18.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.206 "is_configured": false, 00:20:18.206 "data_offset": 2048, 00:20:18.206 "data_size": 63488 00:20:18.206 }, 00:20:18.206 { 00:20:18.206 "name": null, 00:20:18.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.206 "is_configured": false, 00:20:18.206 "data_offset": 2048, 00:20:18.206 "data_size": 63488 00:20:18.206 }, 00:20:18.206 { 00:20:18.206 "name": "BaseBdev3", 00:20:18.206 "uuid": "3cebdbb3-20cc-54e2-aa7e-edcf0ba73137", 00:20:18.206 "is_configured": true, 00:20:18.206 "data_offset": 2048, 00:20:18.206 "data_size": 63488 00:20:18.206 }, 00:20:18.206 { 00:20:18.206 "name": "BaseBdev4", 00:20:18.206 "uuid": "60db6662-0b63-5603-996e-b5fbb5c74f21", 00:20:18.206 "is_configured": true, 00:20:18.206 "data_offset": 2048, 00:20:18.206 "data_size": 63488 00:20:18.206 } 00:20:18.206 ] 00:20:18.206 }' 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2168625 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2168625 ']' 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2168625 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:18.206 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2168625 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2168625' 00:20:18.464 killing process with pid 2168625 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2168625 00:20:18.464 Received shutdown signal, test time was about 60.000000 seconds 00:20:18.464 00:20:18.464 Latency(us) 00:20:18.464 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:18.464 =================================================================================================================== 00:20:18.464 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 
00:20:18.464 [2024-07-24 18:57:03.241805] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:18.464 [2024-07-24 18:57:03.241873] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:18.464 [2024-07-24 18:57:03.241912] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:18.464 [2024-07-24 18:57:03.241918] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1da7210 name raid_bdev1, state offline 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2168625 00:20:18.464 [2024-07-24 18:57:03.281167] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:18.464 00:20:18.464 real 0m29.725s 00:20:18.464 user 0m43.402s 00:20:18.464 sys 0m4.205s 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:18.464 18:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.464 ************************************ 00:20:18.464 END TEST raid_rebuild_test_sb 00:20:18.464 ************************************ 00:20:18.723 18:57:03 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:20:18.723 18:57:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:18.723 18:57:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:18.723 18:57:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:18.723 ************************************ 00:20:18.723 START TEST raid_rebuild_test_io 00:20:18.723 ************************************ 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs 
)) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2174181 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2174181 /var/tmp/spdk-raid.sock 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2174181 ']' 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:18.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:18.723 18:57:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:18.723 [2024-07-24 18:57:03.561151] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:20:18.723 [2024-07-24 18:57:03.561188] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2174181 ] 00:20:18.723 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:18.723 Zero copy mechanism will not be used. 
00:20:18.723 [2024-07-24 18:57:03.618874] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:18.723 [2024-07-24 18:57:03.697409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:18.982 [2024-07-24 18:57:03.754205] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:18.982 [2024-07-24 18:57:03.754231] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:19.590 18:57:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:19.590 18:57:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:20:19.590 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:19.590 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:19.590 BaseBdev1_malloc 00:20:19.590 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:19.848 [2024-07-24 18:57:04.686011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:19.848 [2024-07-24 18:57:04.686047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.848 [2024-07-24 18:57:04.686062] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2482130 00:20:19.848 [2024-07-24 18:57:04.686084] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.848 [2024-07-24 18:57:04.687250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.848 [2024-07-24 18:57:04.687274] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:19.848 BaseBdev1 00:20:19.848 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:19.848 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:20.108 BaseBdev2_malloc 00:20:20.108 18:57:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:20.108 [2024-07-24 18:57:05.026506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:20.108 [2024-07-24 18:57:05.026537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.108 [2024-07-24 18:57:05.026548] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2627fa0 00:20:20.108 [2024-07-24 18:57:05.026554] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.108 [2024-07-24 18:57:05.027617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.108 [2024-07-24 18:57:05.027638] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:20.108 BaseBdev2 00:20:20.108 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:20.108 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:20.366 BaseBdev3_malloc 00:20:20.366 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:20.367 [2024-07-24 18:57:05.342722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:20.367 [2024-07-24 18:57:05.342753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.367 [2024-07-24 18:57:05.342765] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2633970 00:20:20.367 [2024-07-24 18:57:05.342786] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.367 [2024-07-24 18:57:05.343832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.367 [2024-07-24 18:57:05.343852] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:20.367 BaseBdev3 00:20:20.367 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:20.367 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:20.625 BaseBdev4_malloc 00:20:20.625 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:20.884 [2024-07-24 18:57:05.679100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:20.884 [2024-07-24 18:57:05.679136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.884 [2024-07-24 18:57:05.679151] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x262a8c0 00:20:20.884 [2024-07-24 18:57:05.679158] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.884 [2024-07-24 18:57:05.680229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.884 [2024-07-24 18:57:05.680249] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:20.884 BaseBdev4 00:20:20.884 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:20.884 spare_malloc 00:20:20.884 18:57:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:21.140 spare_delay 00:20:21.140 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:21.397 [2024-07-24 18:57:06.159907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:21.397 [2024-07-24 18:57:06.159937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.397 [2024-07-24 18:57:06.159950] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247abf0 00:20:21.397 [2024-07-24 18:57:06.159956] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.397 [2024-07-24 18:57:06.160992] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.397 [2024-07-24 18:57:06.161011] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:21.397 spare 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:21.397 [2024-07-24 18:57:06.328360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:21.397 [2024-07-24 18:57:06.329275] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:21.397 [2024-07-24 18:57:06.329315] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:21.397 [2024-07-24 18:57:06.329343] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:21.397 [2024-07-24 18:57:06.329394] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x247c990 00:20:21.397 [2024-07-24 18:57:06.329399] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:21.397 [2024-07-24 18:57:06.329558] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24801f0 00:20:21.397 [2024-07-24 18:57:06.329665] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x247c990 00:20:21.397 [2024-07-24 18:57:06.329670] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x247c990 00:20:21.397 [2024-07-24 18:57:06.329754] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.397 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.398 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.398 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.398 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.655 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.655 "name": "raid_bdev1", 00:20:21.655 "uuid": 
"18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:21.655 "strip_size_kb": 0, 00:20:21.655 "state": "online", 00:20:21.655 "raid_level": "raid1", 00:20:21.655 "superblock": false, 00:20:21.655 "num_base_bdevs": 4, 00:20:21.655 "num_base_bdevs_discovered": 4, 00:20:21.655 "num_base_bdevs_operational": 4, 00:20:21.655 "base_bdevs_list": [ 00:20:21.655 { 00:20:21.655 "name": "BaseBdev1", 00:20:21.655 "uuid": "66a23e54-75b6-5a60-bf39-ee7182db2008", 00:20:21.655 "is_configured": true, 00:20:21.655 "data_offset": 0, 00:20:21.655 "data_size": 65536 00:20:21.655 }, 00:20:21.655 { 00:20:21.655 "name": "BaseBdev2", 00:20:21.655 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:21.655 "is_configured": true, 00:20:21.655 "data_offset": 0, 00:20:21.655 "data_size": 65536 00:20:21.655 }, 00:20:21.655 { 00:20:21.655 "name": "BaseBdev3", 00:20:21.655 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:21.655 "is_configured": true, 00:20:21.655 "data_offset": 0, 00:20:21.655 "data_size": 65536 00:20:21.655 }, 00:20:21.655 { 00:20:21.655 "name": "BaseBdev4", 00:20:21.655 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:21.655 "is_configured": true, 00:20:21.655 "data_offset": 0, 00:20:21.655 "data_size": 65536 00:20:21.655 } 00:20:21.655 ] 00:20:21.656 }' 00:20:21.656 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.656 18:57:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:22.220 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:22.220 18:57:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:22.220 [2024-07-24 18:57:07.130618] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.220 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:22.220 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.220 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:22.478 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:22.479 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:22.479 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:22.479 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:22.479 [2024-07-24 18:57:07.388913] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2479500 00:20:22.479 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:22.479 Zero copy mechanism will not be used. 00:20:22.479 Running I/O for 60 seconds... 
00:20:22.479 [2024-07-24 18:57:07.481930] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:22.479 [2024-07-24 18:57:07.482092] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2479500 00:20:22.737 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:22.737 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:22.737 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:22.737 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:22.737 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.738 "name": "raid_bdev1", 00:20:22.738 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:22.738 "strip_size_kb": 0, 00:20:22.738 "state": "online", 00:20:22.738 "raid_level": "raid1", 00:20:22.738 "superblock": false, 00:20:22.738 "num_base_bdevs": 4, 00:20:22.738 "num_base_bdevs_discovered": 3, 00:20:22.738 "num_base_bdevs_operational": 3, 00:20:22.738 "base_bdevs_list": [ 00:20:22.738 { 00:20:22.738 "name": null, 00:20:22.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.738 "is_configured": false, 00:20:22.738 "data_offset": 0, 00:20:22.738 "data_size": 65536 00:20:22.738 }, 00:20:22.738 { 00:20:22.738 "name": "BaseBdev2", 00:20:22.738 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:22.738 "is_configured": true, 00:20:22.738 "data_offset": 0, 00:20:22.738 "data_size": 65536 00:20:22.738 }, 00:20:22.738 { 00:20:22.738 "name": "BaseBdev3", 00:20:22.738 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:22.738 "is_configured": true, 00:20:22.738 "data_offset": 0, 00:20:22.738 "data_size": 65536 00:20:22.738 }, 00:20:22.738 { 00:20:22.738 "name": "BaseBdev4", 00:20:22.738 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:22.738 "is_configured": true, 00:20:22.738 "data_offset": 0, 00:20:22.738 "data_size": 65536 00:20:22.738 } 00:20:22.738 ] 00:20:22.738 }' 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.738 18:57:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:23.308 18:57:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:23.566 [2024-07-24 18:57:08.340610] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:23.566 18:57:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:23.566 [2024-07-24 18:57:08.392454] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25194a0 00:20:23.566 [2024-07-24 18:57:08.393955] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:23.566 [2024-07-24 18:57:08.528084] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:23.566 [2024-07-24 18:57:08.529132] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:23.823 [2024-07-24 18:57:08.746886] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:23.823 [2024-07-24 18:57:08.747142] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:24.081 [2024-07-24 18:57:08.991631] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:24.338 [2024-07-24 18:57:09.206167] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:24.338 [2024-07-24 18:57:09.206315] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.595 [2024-07-24 18:57:09.431704] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:24.595 [2024-07-24 18:57:09.546608] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:24.595 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:24.595 "name": "raid_bdev1", 00:20:24.595 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:24.595 "strip_size_kb": 0, 00:20:24.595 "state": "online", 00:20:24.595 "raid_level": "raid1", 00:20:24.595 "superblock": false, 00:20:24.595 "num_base_bdevs": 4, 00:20:24.595 "num_base_bdevs_discovered": 4, 00:20:24.595 "num_base_bdevs_operational": 4, 00:20:24.595 "process": { 00:20:24.595 "type": "rebuild", 00:20:24.595 "target": "spare", 00:20:24.595 "progress": { 00:20:24.595 "blocks": 14336, 00:20:24.595 "percent": 21 00:20:24.595 } 00:20:24.595 }, 00:20:24.595 "base_bdevs_list": [ 00:20:24.595 { 00:20:24.595 "name": "spare", 00:20:24.595 "uuid": 
"c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:24.595 "is_configured": true, 00:20:24.595 "data_offset": 0, 00:20:24.595 "data_size": 65536 00:20:24.595 }, 00:20:24.595 { 00:20:24.595 "name": "BaseBdev2", 00:20:24.595 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:24.595 "is_configured": true, 00:20:24.595 "data_offset": 0, 00:20:24.595 "data_size": 65536 00:20:24.595 }, 00:20:24.595 { 00:20:24.595 "name": "BaseBdev3", 00:20:24.595 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:24.595 "is_configured": true, 00:20:24.595 "data_offset": 0, 00:20:24.595 "data_size": 65536 00:20:24.595 }, 00:20:24.595 { 00:20:24.595 "name": "BaseBdev4", 00:20:24.596 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:24.596 "is_configured": true, 00:20:24.596 "data_offset": 0, 00:20:24.596 "data_size": 65536 00:20:24.596 } 00:20:24.596 ] 00:20:24.596 }' 00:20:24.596 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:24.854 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.854 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:24.854 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:24.854 18:57:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:24.854 [2024-07-24 18:57:09.815854] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:25.113 [2024-07-24 18:57:09.898305] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:25.113 [2024-07-24 18:57:10.006761] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:25.113 [2024-07-24 18:57:10.018314] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:25.113 [2024-07-24 18:57:10.018345] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:25.113 [2024-07-24 18:57:10.018352] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:25.113 [2024-07-24 18:57:10.040828] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2479500 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.113 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.371 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.371 "name": "raid_bdev1", 00:20:25.371 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:25.371 "strip_size_kb": 0, 00:20:25.371 "state": "online", 00:20:25.371 "raid_level": "raid1", 00:20:25.371 "superblock": false, 00:20:25.371 "num_base_bdevs": 4, 00:20:25.371 "num_base_bdevs_discovered": 3, 00:20:25.371 "num_base_bdevs_operational": 3, 00:20:25.371 "base_bdevs_list": [ 00:20:25.371 { 00:20:25.371 "name": null, 00:20:25.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.371 "is_configured": false, 00:20:25.371 "data_offset": 0, 00:20:25.371 "data_size": 65536 00:20:25.371 }, 00:20:25.371 { 00:20:25.371 "name": "BaseBdev2", 00:20:25.371 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:25.371 "is_configured": true, 00:20:25.371 "data_offset": 0, 00:20:25.371 "data_size": 65536 00:20:25.371 }, 00:20:25.371 { 00:20:25.371 "name": "BaseBdev3", 00:20:25.371 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:25.371 "is_configured": true, 00:20:25.371 "data_offset": 0, 00:20:25.371 "data_size": 65536 00:20:25.371 }, 00:20:25.371 { 00:20:25.371 "name": "BaseBdev4", 00:20:25.371 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:25.371 "is_configured": true, 00:20:25.371 "data_offset": 0, 00:20:25.371 "data_size": 65536 00:20:25.371 } 00:20:25.371 ] 00:20:25.371 }' 00:20:25.371 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.371 18:57:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:25.938 "name": "raid_bdev1", 00:20:25.938 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:25.938 "strip_size_kb": 0, 00:20:25.938 "state": "online", 00:20:25.938 "raid_level": "raid1", 00:20:25.938 "superblock": false, 00:20:25.938 "num_base_bdevs": 4, 00:20:25.938 "num_base_bdevs_discovered": 3, 00:20:25.938 "num_base_bdevs_operational": 3, 00:20:25.938 "base_bdevs_list": [ 00:20:25.938 { 00:20:25.938 "name": null, 00:20:25.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.938 "is_configured": false, 00:20:25.938 "data_offset": 0, 00:20:25.938 "data_size": 65536 00:20:25.938 }, 00:20:25.938 { 00:20:25.938 
"name": "BaseBdev2", 00:20:25.938 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:25.938 "is_configured": true, 00:20:25.938 "data_offset": 0, 00:20:25.938 "data_size": 65536 00:20:25.938 }, 00:20:25.938 { 00:20:25.938 "name": "BaseBdev3", 00:20:25.938 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:25.938 "is_configured": true, 00:20:25.938 "data_offset": 0, 00:20:25.938 "data_size": 65536 00:20:25.938 }, 00:20:25.938 { 00:20:25.938 "name": "BaseBdev4", 00:20:25.938 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:25.938 "is_configured": true, 00:20:25.938 "data_offset": 0, 00:20:25.938 "data_size": 65536 00:20:25.938 } 00:20:25.938 ] 00:20:25.938 }' 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:25.938 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:26.194 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:26.194 18:57:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:26.195 [2024-07-24 18:57:11.135156] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:26.195 18:57:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:26.195 [2024-07-24 18:57:11.186960] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25194a0 00:20:26.195 [2024-07-24 18:57:11.188049] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:26.451 [2024-07-24 18:57:11.425576] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:26.451 [2024-07-24 18:57:11.426109] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:27.016 [2024-07-24 18:57:11.758930] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:27.016 [2024-07-24 18:57:11.759202] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:27.016 [2024-07-24 18:57:11.860793] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:27.016 [2024-07-24 18:57:11.860938] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:27.274 [2024-07-24 18:57:12.116295] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.274 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.532 [2024-07-24 18:57:12.325356] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:27.532 [2024-07-24 18:57:12.325892] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.532 "name": "raid_bdev1", 00:20:27.532 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:27.532 "strip_size_kb": 0, 00:20:27.532 "state": "online", 00:20:27.532 "raid_level": "raid1", 00:20:27.532 "superblock": false, 00:20:27.532 "num_base_bdevs": 4, 00:20:27.532 "num_base_bdevs_discovered": 4, 00:20:27.532 "num_base_bdevs_operational": 4, 00:20:27.532 "process": { 00:20:27.532 "type": "rebuild", 00:20:27.532 "target": "spare", 00:20:27.532 "progress": { 00:20:27.532 "blocks": 16384, 00:20:27.532 "percent": 25 00:20:27.532 } 00:20:27.532 }, 00:20:27.532 "base_bdevs_list": [ 00:20:27.532 { 00:20:27.532 "name": "spare", 00:20:27.532 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:27.532 "is_configured": true, 00:20:27.532 "data_offset": 0, 00:20:27.532 "data_size": 65536 00:20:27.532 }, 00:20:27.532 { 00:20:27.532 "name": "BaseBdev2", 00:20:27.532 "uuid": "4693631a-7845-5d04-842e-d9edf3ec7807", 00:20:27.532 "is_configured": true, 00:20:27.532 "data_offset": 0, 00:20:27.532 "data_size": 65536 00:20:27.532 }, 00:20:27.532 { 00:20:27.532 "name": "BaseBdev3", 00:20:27.532 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:27.532 "is_configured": true, 00:20:27.532 "data_offset": 0, 00:20:27.532 "data_size": 65536 00:20:27.532 }, 00:20:27.532 { 00:20:27.532 "name": "BaseBdev4", 00:20:27.532 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:27.532 "is_configured": true, 00:20:27.532 "data_offset": 0, 00:20:27.532 "data_size": 65536 00:20:27.532 } 00:20:27.532 ] 00:20:27.532 }' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:27.532 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:27.790 [2024-07-24 18:57:12.576697] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:27.790 [2024-07-24 18:57:12.631069] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 
offset_begin: 18432 offset_end: 24576 00:20:27.790 [2024-07-24 18:57:12.657821] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2479500 00:20:27.790 [2024-07-24 18:57:12.657839] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25194a0 00:20:27.790 [2024-07-24 18:57:12.659018] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.790 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.790 [2024-07-24 18:57:12.774599] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:28.049 "name": "raid_bdev1", 00:20:28.049 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:28.049 "strip_size_kb": 0, 00:20:28.049 "state": "online", 00:20:28.049 "raid_level": "raid1", 00:20:28.049 "superblock": false, 00:20:28.049 "num_base_bdevs": 4, 00:20:28.049 "num_base_bdevs_discovered": 3, 00:20:28.049 "num_base_bdevs_operational": 3, 00:20:28.049 "process": { 00:20:28.049 "type": "rebuild", 00:20:28.049 "target": "spare", 00:20:28.049 "progress": { 00:20:28.049 "blocks": 22528, 00:20:28.049 "percent": 34 00:20:28.049 } 00:20:28.049 }, 00:20:28.049 "base_bdevs_list": [ 00:20:28.049 { 00:20:28.049 "name": "spare", 00:20:28.049 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:28.049 "is_configured": true, 00:20:28.049 "data_offset": 0, 00:20:28.049 "data_size": 65536 00:20:28.049 }, 00:20:28.049 { 00:20:28.049 "name": null, 00:20:28.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.049 "is_configured": false, 00:20:28.049 "data_offset": 0, 00:20:28.049 "data_size": 65536 00:20:28.049 }, 00:20:28.049 { 00:20:28.049 "name": "BaseBdev3", 00:20:28.049 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:28.049 "is_configured": true, 00:20:28.049 "data_offset": 0, 00:20:28.049 "data_size": 65536 00:20:28.049 }, 00:20:28.049 { 00:20:28.049 "name": "BaseBdev4", 00:20:28.049 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:28.049 "is_configured": true, 00:20:28.049 "data_offset": 0, 00:20:28.049 "data_size": 65536 00:20:28.049 } 00:20:28.049 ] 00:20:28.049 }' 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ 
rebuild == \r\e\b\u\i\l\d ]] 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=713 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.049 18:57:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.308 [2024-07-24 18:57:13.084233] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:28.308 "name": "raid_bdev1", 00:20:28.308 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:28.308 "strip_size_kb": 0, 00:20:28.308 "state": "online", 00:20:28.308 "raid_level": "raid1", 00:20:28.308 "superblock": false, 00:20:28.308 "num_base_bdevs": 4, 00:20:28.308 "num_base_bdevs_discovered": 3, 00:20:28.308 "num_base_bdevs_operational": 3, 00:20:28.308 "process": { 00:20:28.308 "type": "rebuild", 00:20:28.308 "target": "spare", 00:20:28.308 "progress": { 00:20:28.308 "blocks": 26624, 00:20:28.308 "percent": 40 00:20:28.308 } 00:20:28.308 }, 00:20:28.308 "base_bdevs_list": [ 00:20:28.308 { 00:20:28.308 "name": "spare", 00:20:28.308 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:28.308 "is_configured": true, 00:20:28.308 "data_offset": 0, 00:20:28.308 "data_size": 65536 00:20:28.308 }, 00:20:28.308 { 00:20:28.308 "name": null, 00:20:28.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.308 "is_configured": false, 00:20:28.308 "data_offset": 0, 00:20:28.308 "data_size": 65536 00:20:28.308 }, 00:20:28.308 { 00:20:28.308 "name": "BaseBdev3", 00:20:28.308 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:28.308 "is_configured": true, 00:20:28.308 "data_offset": 0, 00:20:28.308 "data_size": 65536 00:20:28.308 }, 00:20:28.308 { 00:20:28.308 "name": "BaseBdev4", 00:20:28.308 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:28.308 "is_configured": true, 00:20:28.308 "data_offset": 0, 00:20:28.308 "data_size": 65536 00:20:28.308 } 00:20:28.308 ] 00:20:28.308 }' 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:28.308 18:57:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:28.565 [2024-07-24 18:57:13.406810] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:28.565 [2024-07-24 18:57:13.406959] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:28.837 [2024-07-24 18:57:13.739777] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:29.100 [2024-07-24 18:57:13.943407] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.358 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.358 [2024-07-24 18:57:14.290592] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:29.617 "name": "raid_bdev1", 00:20:29.617 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:29.617 "strip_size_kb": 0, 00:20:29.617 "state": "online", 00:20:29.617 "raid_level": "raid1", 00:20:29.617 "superblock": false, 00:20:29.617 "num_base_bdevs": 4, 00:20:29.617 "num_base_bdevs_discovered": 3, 00:20:29.617 "num_base_bdevs_operational": 3, 00:20:29.617 "process": { 00:20:29.617 "type": "rebuild", 00:20:29.617 "target": "spare", 00:20:29.617 "progress": { 00:20:29.617 "blocks": 47104, 00:20:29.617 "percent": 71 00:20:29.617 } 00:20:29.617 }, 00:20:29.617 "base_bdevs_list": [ 00:20:29.617 { 00:20:29.617 "name": "spare", 00:20:29.617 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:29.617 "is_configured": true, 00:20:29.617 "data_offset": 0, 00:20:29.617 "data_size": 65536 00:20:29.617 }, 00:20:29.617 { 00:20:29.617 "name": null, 00:20:29.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.617 "is_configured": false, 00:20:29.617 "data_offset": 0, 00:20:29.617 "data_size": 65536 00:20:29.617 }, 00:20:29.617 { 00:20:29.617 "name": "BaseBdev3", 00:20:29.617 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:29.617 "is_configured": true, 00:20:29.617 "data_offset": 0, 00:20:29.617 "data_size": 65536 00:20:29.617 }, 00:20:29.617 { 00:20:29.617 "name": "BaseBdev4", 00:20:29.617 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:29.617 "is_configured": true, 00:20:29.617 "data_offset": 0, 00:20:29.617 "data_size": 65536 00:20:29.617 } 
00:20:29.617 ] 00:20:29.617 }' 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:29.617 18:57:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:29.617 [2024-07-24 18:57:14.517964] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:20:29.875 [2024-07-24 18:57:14.637334] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:30.134 [2024-07-24 18:57:14.950872] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:20:30.392 [2024-07-24 18:57:15.153394] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.651 [2024-07-24 18:57:15.487426] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:30.651 [2024-07-24 18:57:15.592955] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:30.651 [2024-07-24 18:57:15.594931] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.651 "name": "raid_bdev1", 00:20:30.651 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:30.651 "strip_size_kb": 0, 00:20:30.651 "state": "online", 00:20:30.651 "raid_level": "raid1", 00:20:30.651 "superblock": false, 00:20:30.651 "num_base_bdevs": 4, 00:20:30.651 "num_base_bdevs_discovered": 3, 00:20:30.651 "num_base_bdevs_operational": 3, 00:20:30.651 "base_bdevs_list": [ 00:20:30.651 { 00:20:30.651 "name": "spare", 00:20:30.651 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:30.651 "is_configured": true, 00:20:30.651 "data_offset": 0, 00:20:30.651 "data_size": 65536 00:20:30.651 }, 00:20:30.651 { 00:20:30.651 "name": null, 00:20:30.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.651 "is_configured": false, 00:20:30.651 "data_offset": 0, 00:20:30.651 "data_size": 65536 00:20:30.651 }, 00:20:30.651 
{ 00:20:30.651 "name": "BaseBdev3", 00:20:30.651 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:30.651 "is_configured": true, 00:20:30.651 "data_offset": 0, 00:20:30.651 "data_size": 65536 00:20:30.651 }, 00:20:30.651 { 00:20:30.651 "name": "BaseBdev4", 00:20:30.651 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:30.651 "is_configured": true, 00:20:30.651 "data_offset": 0, 00:20:30.651 "data_size": 65536 00:20:30.651 } 00:20:30.651 ] 00:20:30.651 }' 00:20:30.651 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.909 "name": "raid_bdev1", 00:20:30.909 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:30.909 "strip_size_kb": 0, 00:20:30.909 "state": "online", 00:20:30.909 "raid_level": "raid1", 00:20:30.909 "superblock": false, 00:20:30.909 "num_base_bdevs": 4, 00:20:30.909 "num_base_bdevs_discovered": 3, 00:20:30.909 "num_base_bdevs_operational": 3, 00:20:30.909 "base_bdevs_list": [ 00:20:30.909 { 00:20:30.909 "name": "spare", 00:20:30.909 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:30.909 "is_configured": true, 00:20:30.909 "data_offset": 0, 00:20:30.909 "data_size": 65536 00:20:30.909 }, 00:20:30.909 { 00:20:30.909 "name": null, 00:20:30.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.909 "is_configured": false, 00:20:30.909 "data_offset": 0, 00:20:30.909 "data_size": 65536 00:20:30.909 }, 00:20:30.909 { 00:20:30.909 "name": "BaseBdev3", 00:20:30.909 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:30.909 "is_configured": true, 00:20:30.909 "data_offset": 0, 00:20:30.909 "data_size": 65536 00:20:30.909 }, 00:20:30.909 { 00:20:30.909 "name": "BaseBdev4", 00:20:30.909 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:30.909 "is_configured": true, 00:20:30.909 "data_offset": 0, 00:20:30.909 "data_size": 65536 00:20:30.909 } 00:20:30.909 ] 00:20:30.909 }' 00:20:30.909 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:31.167 18:57:15 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.167 18:57:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.167 18:57:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.167 "name": "raid_bdev1", 00:20:31.167 "uuid": "18c1d45b-a47d-48a5-978c-2814c77a97bc", 00:20:31.167 "strip_size_kb": 0, 00:20:31.167 "state": "online", 00:20:31.167 "raid_level": "raid1", 00:20:31.167 "superblock": false, 00:20:31.167 "num_base_bdevs": 4, 00:20:31.167 "num_base_bdevs_discovered": 3, 00:20:31.167 "num_base_bdevs_operational": 3, 00:20:31.167 "base_bdevs_list": [ 00:20:31.167 { 00:20:31.167 "name": "spare", 00:20:31.167 "uuid": "c091ab03-8530-5a80-bb94-570c691f5aff", 00:20:31.167 "is_configured": true, 00:20:31.167 "data_offset": 0, 00:20:31.167 "data_size": 65536 00:20:31.167 }, 00:20:31.167 { 00:20:31.167 "name": null, 00:20:31.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.167 "is_configured": false, 00:20:31.167 "data_offset": 0, 00:20:31.167 "data_size": 65536 00:20:31.167 }, 00:20:31.167 { 00:20:31.167 "name": "BaseBdev3", 00:20:31.168 "uuid": "5466a1e3-9771-5a3b-8dcc-ca9e802a940c", 00:20:31.168 "is_configured": true, 00:20:31.168 "data_offset": 0, 00:20:31.168 "data_size": 65536 00:20:31.168 }, 00:20:31.168 { 00:20:31.168 "name": "BaseBdev4", 00:20:31.168 "uuid": "5be5e648-d674-5bf7-aece-10761a11a1f3", 00:20:31.168 "is_configured": true, 00:20:31.168 "data_offset": 0, 00:20:31.168 "data_size": 65536 00:20:31.168 } 00:20:31.168 ] 00:20:31.168 }' 00:20:31.168 18:57:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.168 18:57:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:31.734 18:57:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:31.993 [2024-07-24 18:57:16.788028] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:20:31.993 [2024-07-24 18:57:16.788052] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:31.993 00:20:31.993 Latency(us) 00:20:31.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:31.993 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:31.993 raid_bdev1 : 9.47 120.43 361.28 0.00 0.00 11521.83 241.86 109351.50 00:20:31.993 =================================================================================================================== 00:20:31.993 Total : 120.43 361.28 0.00 0.00 11521.83 241.86 109351.50 00:20:31.993 [2024-07-24 18:57:16.891067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.993 [2024-07-24 18:57:16.891087] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:31.993 [2024-07-24 18:57:16.891151] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:31.993 [2024-07-24 18:57:16.891157] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x247c990 name raid_bdev1, state offline 00:20:31.993 0 00:20:31.993 18:57:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.993 18:57:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.251 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:32.251 /dev/nbd0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:32.512 1+0 records in 00:20:32.512 1+0 records out 00:20:32.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021977 s, 18.6 MB/s 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:20:32.512 /dev/nbd1 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd 
nbd1 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:32.512 1+0 records in 00:20:32.512 1+0 records out 00:20:32.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221665 s, 18.5 MB/s 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.512 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.771 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:20:33.030 /dev/nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:33.030 1+0 records in 00:20:33.030 1+0 records out 00:20:33.030 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198789 s, 20.6 MB/s 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- 
# '[' 4096 '!=' 0 ']' 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:33.030 18:57:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:33.289 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 
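The nbd block above is the data-integrity check for the rebuilt array: the spare and each surviving base bdev are exported as nbd devices and compared byte for byte. A condensed sketch of that pattern as it appears in this trace (loop structure simplified; device nodes, bdev names, and paths taken from the log):

# condensed sketch of the nbd compare pattern traced above
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock
# export the rebuilt spare once, then compare every remaining base bdev against it
$RPC -s $SOCK nbd_start_disk spare /dev/nbd0
for bdev in BaseBdev3 BaseBdev4; do
    $RPC -s $SOCK nbd_start_disk $bdev /dev/nbd1
    cmp -i 0 /dev/nbd0 /dev/nbd1        # data_offset is 0 in this run, so compare from byte 0
    $RPC -s $SOCK nbd_stop_disk /dev/nbd1
done
$RPC -s $SOCK nbd_stop_disk /dev/nbd0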
00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2174181 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2174181 ']' 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2174181 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2174181 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2174181' 00:20:33.548 killing process with pid 2174181 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2174181 00:20:33.548 Received shutdown signal, test time was about 11.005446 seconds 00:20:33.548 00:20:33.548 Latency(us) 00:20:33.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:33.548 =================================================================================================================== 00:20:33.548 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:33.548 [2024-07-24 18:57:18.422779] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:33.548 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2174181 00:20:33.548 [2024-07-24 18:57:18.457240] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:33.807 00:20:33.807 real 0m15.118s 00:20:33.807 user 0m23.078s 00:20:33.807 sys 0m2.141s 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:33.807 ************************************ 00:20:33.807 END TEST raid_rebuild_test_io 00:20:33.807 ************************************ 00:20:33.807 18:57:18 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:20:33.807 18:57:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:33.807 18:57:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:33.807 18:57:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:33.807 ************************************ 00:20:33.807 START TEST raid_rebuild_test_sb_io 00:20:33.807 ************************************ 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local 
num_base_bdevs=4 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:33.807 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2177324 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2177324 /var/tmp/spdk-raid.sock 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@829 -- # '[' -z 2177324 ']' 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:33.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:33.808 18:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:33.808 [2024-07-24 18:57:18.751517] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:20:33.808 [2024-07-24 18:57:18.751556] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2177324 ] 00:20:33.808 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:33.808 Zero copy mechanism will not be used. 00:20:33.808 [2024-07-24 18:57:18.816369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:34.067 [2024-07-24 18:57:18.895769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:34.067 [2024-07-24 18:57:18.946488] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.067 [2024-07-24 18:57:18.946516] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.635 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:34.635 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:20:34.635 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:34.635 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:34.896 BaseBdev1_malloc 00:20:34.896 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:34.896 [2024-07-24 18:57:19.870370] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:34.896 [2024-07-24 18:57:19.870402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.896 [2024-07-24 18:57:19.870415] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e89130 00:20:34.896 [2024-07-24 18:57:19.870421] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.896 [2024-07-24 18:57:19.871562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.896 [2024-07-24 18:57:19.871583] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:34.896 BaseBdev1 00:20:34.896 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:34.896 18:57:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:35.185 BaseBdev2_malloc 00:20:35.185 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:35.452 [2024-07-24 18:57:20.203698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:35.452 [2024-07-24 18:57:20.203733] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.452 [2024-07-24 18:57:20.203744] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x202efa0 00:20:35.452 [2024-07-24 18:57:20.203749] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.452 [2024-07-24 18:57:20.204853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.452 [2024-07-24 18:57:20.204872] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:35.452 BaseBdev2 00:20:35.452 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:35.452 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:35.452 BaseBdev3_malloc 00:20:35.452 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:35.711 [2024-07-24 18:57:20.535951] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:35.711 [2024-07-24 18:57:20.535982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.711 [2024-07-24 18:57:20.535998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x203a970 00:20:35.711 [2024-07-24 18:57:20.536020] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.711 [2024-07-24 18:57:20.537042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.711 [2024-07-24 18:57:20.537062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:35.711 BaseBdev3 00:20:35.711 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:35.711 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:35.711 BaseBdev4_malloc 00:20:35.711 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:35.969 [2024-07-24 18:57:20.856320] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:35.969 [2024-07-24 18:57:20.856348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.969 [2024-07-24 18:57:20.856361] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20318c0 00:20:35.969 [2024-07-24 18:57:20.856383] 
vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.969 [2024-07-24 18:57:20.857305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.969 [2024-07-24 18:57:20.857325] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:35.969 BaseBdev4 00:20:35.969 18:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:36.228 spare_malloc 00:20:36.228 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:36.228 spare_delay 00:20:36.228 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:36.487 [2024-07-24 18:57:21.357196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:36.487 [2024-07-24 18:57:21.357226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.487 [2024-07-24 18:57:21.357236] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e81bf0 00:20:36.487 [2024-07-24 18:57:21.357242] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.487 [2024-07-24 18:57:21.358308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.487 [2024-07-24 18:57:21.358327] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:36.487 spare 00:20:36.487 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:36.746 [2024-07-24 18:57:21.509623] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:36.746 [2024-07-24 18:57:21.510445] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:36.746 [2024-07-24 18:57:21.510487] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:36.746 [2024-07-24 18:57:21.510515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:36.746 [2024-07-24 18:57:21.510638] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e83990 00:20:36.747 [2024-07-24 18:57:21.510643] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:36.747 [2024-07-24 18:57:21.510767] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x202e3b0 00:20:36.747 [2024-07-24 18:57:21.510865] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e83990 00:20:36.747 [2024-07-24 18:57:21.510870] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e83990 00:20:36.747 [2024-07-24 18:57:21.510929] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.747 "name": "raid_bdev1", 00:20:36.747 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:36.747 "strip_size_kb": 0, 00:20:36.747 "state": "online", 00:20:36.747 "raid_level": "raid1", 00:20:36.747 "superblock": true, 00:20:36.747 "num_base_bdevs": 4, 00:20:36.747 "num_base_bdevs_discovered": 4, 00:20:36.747 "num_base_bdevs_operational": 4, 00:20:36.747 "base_bdevs_list": [ 00:20:36.747 { 00:20:36.747 "name": "BaseBdev1", 00:20:36.747 "uuid": "9e503a66-d5f5-5d6f-a55d-857d6421a9a9", 00:20:36.747 "is_configured": true, 00:20:36.747 "data_offset": 2048, 00:20:36.747 "data_size": 63488 00:20:36.747 }, 00:20:36.747 { 00:20:36.747 "name": "BaseBdev2", 00:20:36.747 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:36.747 "is_configured": true, 00:20:36.747 "data_offset": 2048, 00:20:36.747 "data_size": 63488 00:20:36.747 }, 00:20:36.747 { 00:20:36.747 "name": "BaseBdev3", 00:20:36.747 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:36.747 "is_configured": true, 00:20:36.747 "data_offset": 2048, 00:20:36.747 "data_size": 63488 00:20:36.747 }, 00:20:36.747 { 00:20:36.747 "name": "BaseBdev4", 00:20:36.747 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:36.747 "is_configured": true, 00:20:36.747 "data_offset": 2048, 00:20:36.747 "data_size": 63488 00:20:36.747 } 00:20:36.747 ] 00:20:36.747 }' 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.747 18:57:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:37.314 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:37.314 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:37.314 [2024-07-24 18:57:22.287826] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:37.314 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:37.314 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r 
'.[].base_bdevs_list[0].data_offset' 00:20:37.314 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.572 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:37.572 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:37.572 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:37.572 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:37.573 [2024-07-24 18:57:22.558185] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x202ddf0 00:20:37.573 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:37.573 Zero copy mechanism will not be used. 00:20:37.573 Running I/O for 60 seconds... 00:20:37.832 [2024-07-24 18:57:22.628249] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:37.832 [2024-07-24 18:57:22.628442] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x202ddf0 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.832 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.091 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.091 "name": "raid_bdev1", 00:20:38.091 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:38.091 "strip_size_kb": 0, 00:20:38.091 "state": "online", 00:20:38.091 "raid_level": "raid1", 00:20:38.091 "superblock": true, 00:20:38.091 "num_base_bdevs": 4, 00:20:38.091 "num_base_bdevs_discovered": 3, 00:20:38.091 "num_base_bdevs_operational": 3, 00:20:38.091 "base_bdevs_list": [ 00:20:38.091 { 00:20:38.091 "name": null, 00:20:38.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.091 "is_configured": false, 00:20:38.091 "data_offset": 2048, 00:20:38.091 "data_size": 
63488 00:20:38.091 }, 00:20:38.091 { 00:20:38.091 "name": "BaseBdev2", 00:20:38.091 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:38.091 "is_configured": true, 00:20:38.091 "data_offset": 2048, 00:20:38.091 "data_size": 63488 00:20:38.091 }, 00:20:38.091 { 00:20:38.091 "name": "BaseBdev3", 00:20:38.091 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:38.091 "is_configured": true, 00:20:38.091 "data_offset": 2048, 00:20:38.091 "data_size": 63488 00:20:38.091 }, 00:20:38.091 { 00:20:38.091 "name": "BaseBdev4", 00:20:38.091 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:38.091 "is_configured": true, 00:20:38.091 "data_offset": 2048, 00:20:38.091 "data_size": 63488 00:20:38.091 } 00:20:38.091 ] 00:20:38.091 }' 00:20:38.091 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.091 18:57:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:38.358 18:57:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:38.616 [2024-07-24 18:57:23.501785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.616 [2024-07-24 18:57:23.541534] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e87860 00:20:38.616 [2024-07-24 18:57:23.543182] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.616 18:57:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:38.875 [2024-07-24 18:57:23.651882] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.875 [2024-07-24 18:57:23.652908] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.875 [2024-07-24 18:57:23.855256] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:38.875 [2024-07-24 18:57:23.855698] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:39.442 [2024-07-24 18:57:24.198924] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:39.442 [2024-07-24 18:57:24.199216] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:39.442 [2024-07-24 18:57:24.415801] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.701 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.701 18:57:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.701 [2024-07-24 18:57:24.655241] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.960 "name": "raid_bdev1", 00:20:39.960 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:39.960 "strip_size_kb": 0, 00:20:39.960 "state": "online", 00:20:39.960 "raid_level": "raid1", 00:20:39.960 "superblock": true, 00:20:39.960 "num_base_bdevs": 4, 00:20:39.960 "num_base_bdevs_discovered": 4, 00:20:39.960 "num_base_bdevs_operational": 4, 00:20:39.960 "process": { 00:20:39.960 "type": "rebuild", 00:20:39.960 "target": "spare", 00:20:39.960 "progress": { 00:20:39.960 "blocks": 14336, 00:20:39.960 "percent": 22 00:20:39.960 } 00:20:39.960 }, 00:20:39.960 "base_bdevs_list": [ 00:20:39.960 { 00:20:39.960 "name": "spare", 00:20:39.960 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:39.960 "is_configured": true, 00:20:39.960 "data_offset": 2048, 00:20:39.960 "data_size": 63488 00:20:39.960 }, 00:20:39.960 { 00:20:39.960 "name": "BaseBdev2", 00:20:39.960 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:39.960 "is_configured": true, 00:20:39.960 "data_offset": 2048, 00:20:39.960 "data_size": 63488 00:20:39.960 }, 00:20:39.960 { 00:20:39.960 "name": "BaseBdev3", 00:20:39.960 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:39.960 "is_configured": true, 00:20:39.960 "data_offset": 2048, 00:20:39.960 "data_size": 63488 00:20:39.960 }, 00:20:39.960 { 00:20:39.960 "name": "BaseBdev4", 00:20:39.960 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:39.960 "is_configured": true, 00:20:39.960 "data_offset": 2048, 00:20:39.960 "data_size": 63488 00:20:39.960 } 00:20:39.960 ] 00:20:39.960 }' 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.960 18:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:39.960 [2024-07-24 18:57:24.885438] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:39.960 [2024-07-24 18:57:24.955771] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:40.219 [2024-07-24 18:57:24.992779] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:40.219 [2024-07-24 18:57:24.992923] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:40.219 [2024-07-24 18:57:25.100565] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:40.219 [2024-07-24 18:57:25.109833] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.219 [2024-07-24 18:57:25.109853] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 
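A few entries below, the test re-checks the array state (bdev_raid.sh@655, verify_raid_bdev_state raid_bdev1 online raid1 0 3). As a hedged, simplified sketch of what that check amounts to: the rpc.py call and the jq select filter appear verbatim in the trace, but reducing the helper to four bare test commands is an assumption, not the full verify_raid_bdev_state implementation.

#!/usr/bin/env bash
# Hedged sketch: query the raid bdev over RPC and assert the fields the test inspects.
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
    jq -r '.[] | select(.name == "raid_bdev1")')

# Expected values mirror the "online raid1 0 3" arguments seen in the trace.
[ "$(jq -r '.state' <<< "$info")" = online ]
[ "$(jq -r '.raid_level' <<< "$info")" = raid1 ]
[ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 3 ]
[ "$(jq -r '.num_base_bdevs_operational' <<< "$info")" -eq 3 ]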
00:20:40.219 [2024-07-24 18:57:25.109858] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:40.219 [2024-07-24 18:57:25.131950] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x202ddf0 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.219 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.477 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.477 "name": "raid_bdev1", 00:20:40.477 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:40.477 "strip_size_kb": 0, 00:20:40.477 "state": "online", 00:20:40.477 "raid_level": "raid1", 00:20:40.477 "superblock": true, 00:20:40.477 "num_base_bdevs": 4, 00:20:40.477 "num_base_bdevs_discovered": 3, 00:20:40.477 "num_base_bdevs_operational": 3, 00:20:40.477 "base_bdevs_list": [ 00:20:40.477 { 00:20:40.477 "name": null, 00:20:40.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.477 "is_configured": false, 00:20:40.477 "data_offset": 2048, 00:20:40.477 "data_size": 63488 00:20:40.477 }, 00:20:40.477 { 00:20:40.477 "name": "BaseBdev2", 00:20:40.477 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:40.477 "is_configured": true, 00:20:40.477 "data_offset": 2048, 00:20:40.477 "data_size": 63488 00:20:40.477 }, 00:20:40.477 { 00:20:40.477 "name": "BaseBdev3", 00:20:40.477 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:40.477 "is_configured": true, 00:20:40.477 "data_offset": 2048, 00:20:40.477 "data_size": 63488 00:20:40.477 }, 00:20:40.477 { 00:20:40.477 "name": "BaseBdev4", 00:20:40.477 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:40.477 "is_configured": true, 00:20:40.477 "data_offset": 2048, 00:20:40.477 "data_size": 63488 00:20:40.477 } 00:20:40.477 ] 00:20:40.477 }' 00:20:40.477 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.477 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:41.044 18:57:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.044 18:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.044 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.044 "name": "raid_bdev1", 00:20:41.044 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:41.044 "strip_size_kb": 0, 00:20:41.044 "state": "online", 00:20:41.044 "raid_level": "raid1", 00:20:41.044 "superblock": true, 00:20:41.044 "num_base_bdevs": 4, 00:20:41.044 "num_base_bdevs_discovered": 3, 00:20:41.044 "num_base_bdevs_operational": 3, 00:20:41.044 "base_bdevs_list": [ 00:20:41.044 { 00:20:41.044 "name": null, 00:20:41.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.044 "is_configured": false, 00:20:41.044 "data_offset": 2048, 00:20:41.044 "data_size": 63488 00:20:41.044 }, 00:20:41.044 { 00:20:41.044 "name": "BaseBdev2", 00:20:41.044 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:41.044 "is_configured": true, 00:20:41.044 "data_offset": 2048, 00:20:41.044 "data_size": 63488 00:20:41.044 }, 00:20:41.044 { 00:20:41.044 "name": "BaseBdev3", 00:20:41.044 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:41.044 "is_configured": true, 00:20:41.044 "data_offset": 2048, 00:20:41.044 "data_size": 63488 00:20:41.044 }, 00:20:41.044 { 00:20:41.044 "name": "BaseBdev4", 00:20:41.044 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:41.044 "is_configured": true, 00:20:41.044 "data_offset": 2048, 00:20:41.044 "data_size": 63488 00:20:41.044 } 00:20:41.044 ] 00:20:41.044 }' 00:20:41.044 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:41.303 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:41.303 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.303 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:41.303 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:41.303 [2024-07-24 18:57:26.247144] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.303 18:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:41.303 [2024-07-24 18:57:26.287011] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8fb40 00:20:41.303 [2024-07-24 18:57:26.288107] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:41.562 [2024-07-24 18:57:26.409240] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:41.562 [2024-07-24 18:57:26.409551] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:41.562 [2024-07-24 18:57:26.547531] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:41.562 [2024-07-24 18:57:26.548059] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:42.130 [2024-07-24 18:57:26.880458] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:42.130 [2024-07-24 18:57:26.989512] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:42.130 [2024-07-24 18:57:26.989712] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.389 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.389 [2024-07-24 18:57:27.327906] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:42.648 [2024-07-24 18:57:27.435938] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:42.648 "name": "raid_bdev1", 00:20:42.648 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:42.648 "strip_size_kb": 0, 00:20:42.648 "state": "online", 00:20:42.648 "raid_level": "raid1", 00:20:42.648 "superblock": true, 00:20:42.648 "num_base_bdevs": 4, 00:20:42.648 "num_base_bdevs_discovered": 4, 00:20:42.648 "num_base_bdevs_operational": 4, 00:20:42.648 "process": { 00:20:42.648 "type": "rebuild", 00:20:42.648 "target": "spare", 00:20:42.648 "progress": { 00:20:42.648 "blocks": 16384, 00:20:42.648 "percent": 25 00:20:42.648 } 00:20:42.648 }, 00:20:42.648 "base_bdevs_list": [ 00:20:42.648 { 00:20:42.648 "name": "spare", 00:20:42.648 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:42.648 "is_configured": true, 00:20:42.648 "data_offset": 2048, 00:20:42.648 "data_size": 63488 00:20:42.648 }, 00:20:42.648 { 00:20:42.648 "name": "BaseBdev2", 00:20:42.648 "uuid": "11a03835-a053-5c3f-bebe-3aa1376df995", 00:20:42.648 "is_configured": true, 00:20:42.648 "data_offset": 2048, 00:20:42.648 "data_size": 63488 00:20:42.648 }, 00:20:42.648 { 00:20:42.648 "name": "BaseBdev3", 00:20:42.648 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:42.648 "is_configured": true, 00:20:42.648 "data_offset": 2048, 00:20:42.648 "data_size": 63488 00:20:42.648 }, 00:20:42.648 { 00:20:42.648 
"name": "BaseBdev4", 00:20:42.648 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:42.648 "is_configured": true, 00:20:42.648 "data_offset": 2048, 00:20:42.648 "data_size": 63488 00:20:42.648 } 00:20:42.648 ] 00:20:42.648 }' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:42.648 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:42.648 18:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:42.907 [2024-07-24 18:57:27.661415] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:42.907 [2024-07-24 18:57:27.683385] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:42.907 [2024-07-24 18:57:27.873150] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:43.165 [2024-07-24 18:57:28.006672] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x202ddf0 00:20:43.166 [2024-07-24 18:57:28.006691] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1b8fb40 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.166 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.425 "name": "raid_bdev1", 00:20:43.425 
"uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:43.425 "strip_size_kb": 0, 00:20:43.425 "state": "online", 00:20:43.425 "raid_level": "raid1", 00:20:43.425 "superblock": true, 00:20:43.425 "num_base_bdevs": 4, 00:20:43.425 "num_base_bdevs_discovered": 3, 00:20:43.425 "num_base_bdevs_operational": 3, 00:20:43.425 "process": { 00:20:43.425 "type": "rebuild", 00:20:43.425 "target": "spare", 00:20:43.425 "progress": { 00:20:43.425 "blocks": 24576, 00:20:43.425 "percent": 38 00:20:43.425 } 00:20:43.425 }, 00:20:43.425 "base_bdevs_list": [ 00:20:43.425 { 00:20:43.425 "name": "spare", 00:20:43.425 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:43.425 "is_configured": true, 00:20:43.425 "data_offset": 2048, 00:20:43.425 "data_size": 63488 00:20:43.425 }, 00:20:43.425 { 00:20:43.425 "name": null, 00:20:43.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.425 "is_configured": false, 00:20:43.425 "data_offset": 2048, 00:20:43.425 "data_size": 63488 00:20:43.425 }, 00:20:43.425 { 00:20:43.425 "name": "BaseBdev3", 00:20:43.425 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:43.425 "is_configured": true, 00:20:43.425 "data_offset": 2048, 00:20:43.425 "data_size": 63488 00:20:43.425 }, 00:20:43.425 { 00:20:43.425 "name": "BaseBdev4", 00:20:43.425 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:43.425 "is_configured": true, 00:20:43.425 "data_offset": 2048, 00:20:43.425 "data_size": 63488 00:20:43.425 } 00:20:43.425 ] 00:20:43.425 }' 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.425 [2024-07-24 18:57:28.255059] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=729 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.425 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.684 [2024-07-24 18:57:28.476036] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:43.684 [2024-07-24 18:57:28.476286] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 
30720 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.684 "name": "raid_bdev1", 00:20:43.684 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:43.684 "strip_size_kb": 0, 00:20:43.684 "state": "online", 00:20:43.684 "raid_level": "raid1", 00:20:43.684 "superblock": true, 00:20:43.684 "num_base_bdevs": 4, 00:20:43.684 "num_base_bdevs_discovered": 3, 00:20:43.684 "num_base_bdevs_operational": 3, 00:20:43.684 "process": { 00:20:43.684 "type": "rebuild", 00:20:43.684 "target": "spare", 00:20:43.684 "progress": { 00:20:43.684 "blocks": 26624, 00:20:43.684 "percent": 41 00:20:43.684 } 00:20:43.684 }, 00:20:43.684 "base_bdevs_list": [ 00:20:43.684 { 00:20:43.684 "name": "spare", 00:20:43.684 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:43.684 "is_configured": true, 00:20:43.684 "data_offset": 2048, 00:20:43.684 "data_size": 63488 00:20:43.684 }, 00:20:43.684 { 00:20:43.684 "name": null, 00:20:43.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.684 "is_configured": false, 00:20:43.684 "data_offset": 2048, 00:20:43.684 "data_size": 63488 00:20:43.684 }, 00:20:43.684 { 00:20:43.684 "name": "BaseBdev3", 00:20:43.684 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:43.684 "is_configured": true, 00:20:43.684 "data_offset": 2048, 00:20:43.684 "data_size": 63488 00:20:43.684 }, 00:20:43.684 { 00:20:43.684 "name": "BaseBdev4", 00:20:43.684 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:43.684 "is_configured": true, 00:20:43.684 "data_offset": 2048, 00:20:43.684 "data_size": 63488 00:20:43.684 } 00:20:43.684 ] 00:20:43.684 }' 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.684 18:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:43.942 [2024-07-24 18:57:28.933351] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:43.942 [2024-07-24 18:57:28.933514] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:44.201 [2024-07-24 18:57:29.160940] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:44.767 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:44.768 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.768 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.768 "name": "raid_bdev1", 00:20:44.768 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:44.768 "strip_size_kb": 0, 00:20:44.768 "state": "online", 00:20:44.768 "raid_level": "raid1", 00:20:44.768 "superblock": true, 00:20:44.768 "num_base_bdevs": 4, 00:20:44.768 "num_base_bdevs_discovered": 3, 00:20:44.768 "num_base_bdevs_operational": 3, 00:20:44.768 "process": { 00:20:44.768 "type": "rebuild", 00:20:44.768 "target": "spare", 00:20:44.768 "progress": { 00:20:44.768 "blocks": 47104, 00:20:44.768 "percent": 74 00:20:44.768 } 00:20:44.768 }, 00:20:44.768 "base_bdevs_list": [ 00:20:44.768 { 00:20:44.768 "name": "spare", 00:20:44.768 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:44.768 "is_configured": true, 00:20:44.768 "data_offset": 2048, 00:20:44.768 "data_size": 63488 00:20:44.768 }, 00:20:44.768 { 00:20:44.768 "name": null, 00:20:44.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.768 "is_configured": false, 00:20:44.768 "data_offset": 2048, 00:20:44.768 "data_size": 63488 00:20:44.768 }, 00:20:44.768 { 00:20:44.768 "name": "BaseBdev3", 00:20:44.768 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:44.768 "is_configured": true, 00:20:44.768 "data_offset": 2048, 00:20:44.768 "data_size": 63488 00:20:44.768 }, 00:20:44.768 { 00:20:44.768 "name": "BaseBdev4", 00:20:44.768 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:44.768 "is_configured": true, 00:20:44.768 "data_offset": 2048, 00:20:44.768 "data_size": 63488 00:20:44.768 } 00:20:44.768 ] 00:20:44.768 }' 00:20:45.027 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:45.027 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:45.027 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:45.027 [2024-07-24 18:57:29.862251] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:20:45.027 [2024-07-24 18:57:29.862934] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:20:45.027 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:45.027 18:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:45.286 [2024-07-24 18:57:30.071145] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:45.286 [2024-07-24 18:57:30.071578] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:45.545 [2024-07-24 18:57:30.405674] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:20:46.112 [2024-07-24 18:57:30.847279] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:46.112 
18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.112 18:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.112 [2024-07-24 18:57:30.952960] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:46.112 [2024-07-24 18:57:30.954879] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.112 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:46.112 "name": "raid_bdev1", 00:20:46.112 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:46.112 "strip_size_kb": 0, 00:20:46.112 "state": "online", 00:20:46.112 "raid_level": "raid1", 00:20:46.112 "superblock": true, 00:20:46.112 "num_base_bdevs": 4, 00:20:46.112 "num_base_bdevs_discovered": 3, 00:20:46.112 "num_base_bdevs_operational": 3, 00:20:46.112 "base_bdevs_list": [ 00:20:46.112 { 00:20:46.112 "name": "spare", 00:20:46.112 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:46.113 "is_configured": true, 00:20:46.113 "data_offset": 2048, 00:20:46.113 "data_size": 63488 00:20:46.113 }, 00:20:46.113 { 00:20:46.113 "name": null, 00:20:46.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.113 "is_configured": false, 00:20:46.113 "data_offset": 2048, 00:20:46.113 "data_size": 63488 00:20:46.113 }, 00:20:46.113 { 00:20:46.113 "name": "BaseBdev3", 00:20:46.113 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:46.113 "is_configured": true, 00:20:46.113 "data_offset": 2048, 00:20:46.113 "data_size": 63488 00:20:46.113 }, 00:20:46.113 { 00:20:46.113 "name": "BaseBdev4", 00:20:46.113 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:46.113 "is_configured": true, 00:20:46.113 "data_offset": 2048, 00:20:46.113 "data_size": 63488 00:20:46.113 } 00:20:46.113 ] 00:20:46.113 }' 00:20:46.113 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:46.113 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:46.113 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:46.371 "name": "raid_bdev1", 00:20:46.371 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:46.371 "strip_size_kb": 0, 00:20:46.371 "state": "online", 00:20:46.371 "raid_level": "raid1", 00:20:46.371 "superblock": true, 00:20:46.371 "num_base_bdevs": 4, 00:20:46.371 "num_base_bdevs_discovered": 3, 00:20:46.371 "num_base_bdevs_operational": 3, 00:20:46.371 "base_bdevs_list": [ 00:20:46.371 { 00:20:46.371 "name": "spare", 00:20:46.371 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:46.371 "is_configured": true, 00:20:46.371 "data_offset": 2048, 00:20:46.371 "data_size": 63488 00:20:46.371 }, 00:20:46.371 { 00:20:46.371 "name": null, 00:20:46.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.371 "is_configured": false, 00:20:46.371 "data_offset": 2048, 00:20:46.371 "data_size": 63488 00:20:46.371 }, 00:20:46.371 { 00:20:46.371 "name": "BaseBdev3", 00:20:46.371 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:46.371 "is_configured": true, 00:20:46.371 "data_offset": 2048, 00:20:46.371 "data_size": 63488 00:20:46.371 }, 00:20:46.371 { 00:20:46.371 "name": "BaseBdev4", 00:20:46.371 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:46.371 "is_configured": true, 00:20:46.371 "data_offset": 2048, 00:20:46.371 "data_size": 63488 00:20:46.371 } 00:20:46.371 ] 00:20:46.371 }' 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:46.371 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.630 "name": "raid_bdev1", 00:20:46.630 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:46.630 "strip_size_kb": 0, 00:20:46.630 "state": "online", 00:20:46.630 "raid_level": "raid1", 00:20:46.630 "superblock": true, 00:20:46.630 "num_base_bdevs": 4, 00:20:46.630 "num_base_bdevs_discovered": 3, 00:20:46.630 "num_base_bdevs_operational": 3, 00:20:46.630 "base_bdevs_list": [ 00:20:46.630 { 00:20:46.630 "name": "spare", 00:20:46.630 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:46.630 "is_configured": true, 00:20:46.630 "data_offset": 2048, 00:20:46.630 "data_size": 63488 00:20:46.630 }, 00:20:46.630 { 00:20:46.630 "name": null, 00:20:46.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.630 "is_configured": false, 00:20:46.630 "data_offset": 2048, 00:20:46.630 "data_size": 63488 00:20:46.630 }, 00:20:46.630 { 00:20:46.630 "name": "BaseBdev3", 00:20:46.630 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:46.630 "is_configured": true, 00:20:46.630 "data_offset": 2048, 00:20:46.630 "data_size": 63488 00:20:46.630 }, 00:20:46.630 { 00:20:46.630 "name": "BaseBdev4", 00:20:46.630 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:46.630 "is_configured": true, 00:20:46.630 "data_offset": 2048, 00:20:46.630 "data_size": 63488 00:20:46.630 } 00:20:46.630 ] 00:20:46.630 }' 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.630 18:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:47.198 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:47.198 [2024-07-24 18:57:32.195748] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:47.198 [2024-07-24 18:57:32.195774] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:47.457 00:20:47.457 Latency(us) 00:20:47.457 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:47.457 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:47.457 raid_bdev1 : 9.66 109.78 329.35 0.00 0.00 12259.11 239.91 108852.18 00:20:47.457 =================================================================================================================== 00:20:47.457 Total : 109.78 329.35 0.00 0.00 12259.11 239.91 108852.18 00:20:47.457 [2024-07-24 18:57:32.250719] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:47.457 [2024-07-24 18:57:32.250744] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:47.457 [2024-07-24 18:57:32.250810] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:47.457 [2024-07-24 18:57:32.250817] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e83990 name raid_bdev1, state offline 00:20:47.457 0 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@719 -- # jq length 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.457 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:47.716 /dev/nbd0 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:47.716 1+0 records in 00:20:47.716 1+0 records out 00:20:47.716 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219124 s, 18.7 MB/s 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:47.716 18:57:32 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.716 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:20:47.975 /dev/nbd1 00:20:47.975 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:47.975 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:47.975 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:47.975 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:20:47.975 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:47.976 1+0 records in 00:20:47.976 1+0 records out 00:20:47.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000106614 s, 38.4 MB/s 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c 
%s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:47.976 18:57:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:48.234 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@12 -- # local i 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:48.235 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:20:48.493 /dev/nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:48.493 1+0 records in 00:20:48.493 1+0 records out 00:20:48.493 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000123323 s, 33.2 MB/s 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:48.493 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:48.494 
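
Stepping back from the raw trace for a moment: each pass above exports the rebuilt array's spare plus one surviving base bdev through NBD and byte-compares them past the superblock region. A minimal sketch of that loop, assuming the same RPC socket, bdev names, and 1 MiB data offset (data_offset 2048 blocks x 512 B) seen in this run; nbd_start_disk/nbd_stop_disk and cmp -i 1048576 are copied from the trace, the loop structure is illustrative:

    #!/usr/bin/env bash
    # Hedged sketch of the per-base-bdev comparison performed in this test run.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Export the spare once, then compare each remaining base bdev against it.
    $RPC nbd_start_disk spare /dev/nbd0
    for bdev in BaseBdev3 BaseBdev4; do
        $RPC nbd_start_disk "$bdev" /dev/nbd1
        cmp -i 1048576 /dev/nbd0 /dev/nbd1   # skip the first 1 MiB (superblock/data_offset) on both devices
        $RPC nbd_stop_disk /dev/nbd1
    done
    $RPC nbd_stop_disk /dev/nbd0
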
18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:48.494 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:48.752 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:49.010 18:57:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:49.269 [2024-07-24 18:57:34.024348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:49.269 [2024-07-24 18:57:34.024382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.269 [2024-07-24 18:57:34.024393] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e81e20 00:20:49.269 [2024-07-24 18:57:34.024400] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.269 [2024-07-24 18:57:34.025618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.269 [2024-07-24 18:57:34.025639] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:49.269 [2024-07-24 18:57:34.025690] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:49.269 [2024-07-24 18:57:34.025709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:49.269 [2024-07-24 18:57:34.025779] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:49.269 [2024-07-24 18:57:34.025827] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:49.269 spare 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.269 [2024-07-24 18:57:34.126119] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e801d0 00:20:49.269 [2024-07-24 18:57:34.126130] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:49.269 [2024-07-24 18:57:34.126254] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ea00f0 00:20:49.269 [2024-07-24 18:57:34.126353] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e801d0 00:20:49.269 [2024-07-24 18:57:34.126358] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e801d0 00:20:49.269 [2024-07-24 18:57:34.126423] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.269 "name": "raid_bdev1", 00:20:49.269 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:49.269 "strip_size_kb": 0, 00:20:49.269 "state": "online", 00:20:49.269 "raid_level": "raid1", 00:20:49.269 "superblock": true, 00:20:49.269 "num_base_bdevs": 4, 00:20:49.269 
"num_base_bdevs_discovered": 3, 00:20:49.269 "num_base_bdevs_operational": 3, 00:20:49.269 "base_bdevs_list": [ 00:20:49.269 { 00:20:49.269 "name": "spare", 00:20:49.269 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:49.269 "is_configured": true, 00:20:49.269 "data_offset": 2048, 00:20:49.269 "data_size": 63488 00:20:49.269 }, 00:20:49.269 { 00:20:49.269 "name": null, 00:20:49.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.269 "is_configured": false, 00:20:49.269 "data_offset": 2048, 00:20:49.269 "data_size": 63488 00:20:49.269 }, 00:20:49.269 { 00:20:49.269 "name": "BaseBdev3", 00:20:49.269 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:49.269 "is_configured": true, 00:20:49.269 "data_offset": 2048, 00:20:49.269 "data_size": 63488 00:20:49.269 }, 00:20:49.269 { 00:20:49.269 "name": "BaseBdev4", 00:20:49.269 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:49.269 "is_configured": true, 00:20:49.269 "data_offset": 2048, 00:20:49.269 "data_size": 63488 00:20:49.269 } 00:20:49.269 ] 00:20:49.269 }' 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.269 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.836 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:50.094 "name": "raid_bdev1", 00:20:50.094 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:50.094 "strip_size_kb": 0, 00:20:50.094 "state": "online", 00:20:50.094 "raid_level": "raid1", 00:20:50.094 "superblock": true, 00:20:50.094 "num_base_bdevs": 4, 00:20:50.094 "num_base_bdevs_discovered": 3, 00:20:50.094 "num_base_bdevs_operational": 3, 00:20:50.094 "base_bdevs_list": [ 00:20:50.094 { 00:20:50.094 "name": "spare", 00:20:50.094 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:50.094 "is_configured": true, 00:20:50.094 "data_offset": 2048, 00:20:50.094 "data_size": 63488 00:20:50.094 }, 00:20:50.094 { 00:20:50.094 "name": null, 00:20:50.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.094 "is_configured": false, 00:20:50.094 "data_offset": 2048, 00:20:50.094 "data_size": 63488 00:20:50.094 }, 00:20:50.094 { 00:20:50.094 "name": "BaseBdev3", 00:20:50.094 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:50.094 "is_configured": true, 00:20:50.094 "data_offset": 2048, 00:20:50.094 "data_size": 63488 00:20:50.094 }, 00:20:50.094 { 00:20:50.094 "name": "BaseBdev4", 00:20:50.094 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:50.094 "is_configured": true, 00:20:50.094 "data_offset": 2048, 00:20:50.094 "data_size": 63488 00:20:50.094 } 00:20:50.094 ] 
00:20:50.094 }' 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.094 18:57:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:50.352 [2024-07-24 18:57:35.315872] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.352 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.353 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.611 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.611 "name": "raid_bdev1", 00:20:50.611 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:50.611 "strip_size_kb": 0, 00:20:50.611 "state": "online", 00:20:50.611 "raid_level": "raid1", 00:20:50.611 "superblock": true, 00:20:50.611 "num_base_bdevs": 4, 00:20:50.611 "num_base_bdevs_discovered": 2, 00:20:50.611 "num_base_bdevs_operational": 2, 00:20:50.611 "base_bdevs_list": [ 00:20:50.611 { 00:20:50.611 "name": null, 00:20:50.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.611 "is_configured": false, 00:20:50.611 "data_offset": 2048, 00:20:50.611 "data_size": 63488 00:20:50.611 }, 00:20:50.611 { 00:20:50.611 "name": null, 00:20:50.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.611 "is_configured": false, 00:20:50.611 
"data_offset": 2048, 00:20:50.611 "data_size": 63488 00:20:50.611 }, 00:20:50.611 { 00:20:50.611 "name": "BaseBdev3", 00:20:50.611 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:50.611 "is_configured": true, 00:20:50.611 "data_offset": 2048, 00:20:50.611 "data_size": 63488 00:20:50.611 }, 00:20:50.611 { 00:20:50.611 "name": "BaseBdev4", 00:20:50.611 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:50.611 "is_configured": true, 00:20:50.611 "data_offset": 2048, 00:20:50.611 "data_size": 63488 00:20:50.611 } 00:20:50.611 ] 00:20:50.611 }' 00:20:50.611 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.611 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:51.209 18:57:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:51.209 [2024-07-24 18:57:36.138073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:51.209 [2024-07-24 18:57:36.138183] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:51.209 [2024-07-24 18:57:36.138192] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:51.209 [2024-07-24 18:57:36.138210] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:51.209 [2024-07-24 18:57:36.141997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb3b30 00:20:51.209 [2024-07-24 18:57:36.143340] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:51.209 18:57:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:52.585 "name": "raid_bdev1", 00:20:52.585 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:52.585 "strip_size_kb": 0, 00:20:52.585 "state": "online", 00:20:52.585 "raid_level": "raid1", 00:20:52.585 "superblock": true, 00:20:52.585 "num_base_bdevs": 4, 00:20:52.585 "num_base_bdevs_discovered": 3, 00:20:52.585 "num_base_bdevs_operational": 3, 00:20:52.585 "process": { 00:20:52.585 "type": "rebuild", 00:20:52.585 "target": "spare", 00:20:52.585 "progress": { 00:20:52.585 "blocks": 22528, 00:20:52.585 "percent": 35 00:20:52.585 } 00:20:52.585 }, 00:20:52.585 "base_bdevs_list": [ 00:20:52.585 { 00:20:52.585 "name": "spare", 00:20:52.585 
"uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:52.585 "is_configured": true, 00:20:52.585 "data_offset": 2048, 00:20:52.585 "data_size": 63488 00:20:52.585 }, 00:20:52.585 { 00:20:52.585 "name": null, 00:20:52.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.585 "is_configured": false, 00:20:52.585 "data_offset": 2048, 00:20:52.585 "data_size": 63488 00:20:52.585 }, 00:20:52.585 { 00:20:52.585 "name": "BaseBdev3", 00:20:52.585 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:52.585 "is_configured": true, 00:20:52.585 "data_offset": 2048, 00:20:52.585 "data_size": 63488 00:20:52.585 }, 00:20:52.585 { 00:20:52.585 "name": "BaseBdev4", 00:20:52.585 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:52.585 "is_configured": true, 00:20:52.585 "data_offset": 2048, 00:20:52.585 "data_size": 63488 00:20:52.585 } 00:20:52.585 ] 00:20:52.585 }' 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:52.585 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:52.585 [2024-07-24 18:57:37.575732] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:52.844 [2024-07-24 18:57:37.653975] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:52.844 [2024-07-24 18:57:37.654007] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:52.844 [2024-07-24 18:57:37.654016] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:52.844 [2024-07-24 18:57:37.654020] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.844 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.844 18:57:37 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.104 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.104 "name": "raid_bdev1", 00:20:53.104 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:53.104 "strip_size_kb": 0, 00:20:53.104 "state": "online", 00:20:53.104 "raid_level": "raid1", 00:20:53.104 "superblock": true, 00:20:53.104 "num_base_bdevs": 4, 00:20:53.104 "num_base_bdevs_discovered": 2, 00:20:53.104 "num_base_bdevs_operational": 2, 00:20:53.104 "base_bdevs_list": [ 00:20:53.104 { 00:20:53.104 "name": null, 00:20:53.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.104 "is_configured": false, 00:20:53.104 "data_offset": 2048, 00:20:53.104 "data_size": 63488 00:20:53.104 }, 00:20:53.104 { 00:20:53.104 "name": null, 00:20:53.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.104 "is_configured": false, 00:20:53.104 "data_offset": 2048, 00:20:53.104 "data_size": 63488 00:20:53.104 }, 00:20:53.104 { 00:20:53.104 "name": "BaseBdev3", 00:20:53.104 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:53.104 "is_configured": true, 00:20:53.104 "data_offset": 2048, 00:20:53.104 "data_size": 63488 00:20:53.104 }, 00:20:53.104 { 00:20:53.104 "name": "BaseBdev4", 00:20:53.104 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:53.104 "is_configured": true, 00:20:53.104 "data_offset": 2048, 00:20:53.104 "data_size": 63488 00:20:53.104 } 00:20:53.104 ] 00:20:53.104 }' 00:20:53.104 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.104 18:57:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:53.363 18:57:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:53.622 [2024-07-24 18:57:38.536249] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:53.622 [2024-07-24 18:57:38.536284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:53.622 [2024-07-24 18:57:38.536296] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb4880 00:20:53.622 [2024-07-24 18:57:38.536318] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:53.622 [2024-07-24 18:57:38.536593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:53.622 [2024-07-24 18:57:38.536604] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:53.622 [2024-07-24 18:57:38.536659] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:53.622 [2024-07-24 18:57:38.536666] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:53.622 [2024-07-24 18:57:38.536671] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:53.622 [2024-07-24 18:57:38.536681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:53.622 [2024-07-24 18:57:38.540474] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ea00f0 00:20:53.622 spare 00:20:53.622 [2024-07-24 18:57:38.541537] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:53.622 18:57:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.557 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.816 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:54.816 "name": "raid_bdev1", 00:20:54.816 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:54.816 "strip_size_kb": 0, 00:20:54.816 "state": "online", 00:20:54.816 "raid_level": "raid1", 00:20:54.816 "superblock": true, 00:20:54.816 "num_base_bdevs": 4, 00:20:54.816 "num_base_bdevs_discovered": 3, 00:20:54.816 "num_base_bdevs_operational": 3, 00:20:54.816 "process": { 00:20:54.816 "type": "rebuild", 00:20:54.816 "target": "spare", 00:20:54.816 "progress": { 00:20:54.816 "blocks": 22528, 00:20:54.816 "percent": 35 00:20:54.816 } 00:20:54.816 }, 00:20:54.816 "base_bdevs_list": [ 00:20:54.816 { 00:20:54.816 "name": "spare", 00:20:54.816 "uuid": "0197bd46-dd18-568e-a89f-74ce11c92137", 00:20:54.816 "is_configured": true, 00:20:54.816 "data_offset": 2048, 00:20:54.816 "data_size": 63488 00:20:54.816 }, 00:20:54.816 { 00:20:54.816 "name": null, 00:20:54.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.816 "is_configured": false, 00:20:54.816 "data_offset": 2048, 00:20:54.816 "data_size": 63488 00:20:54.816 }, 00:20:54.816 { 00:20:54.816 "name": "BaseBdev3", 00:20:54.816 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:54.816 "is_configured": true, 00:20:54.816 "data_offset": 2048, 00:20:54.816 "data_size": 63488 00:20:54.816 }, 00:20:54.816 { 00:20:54.816 "name": "BaseBdev4", 00:20:54.816 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:54.816 "is_configured": true, 00:20:54.816 "data_offset": 2048, 00:20:54.816 "data_size": 63488 00:20:54.816 } 00:20:54.816 ] 00:20:54.816 }' 00:20:54.816 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:54.816 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:54.816 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:54.816 18:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:54.816 18:57:39 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:55.075 [2024-07-24 18:57:39.972621] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.075 [2024-07-24 18:57:40.052073] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:55.075 [2024-07-24 18:57:40.052118] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.075 [2024-07-24 18:57:40.052143] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.075 [2024-07-24 18:57:40.052148] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:55.075 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.335 "name": "raid_bdev1", 00:20:55.335 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:55.335 "strip_size_kb": 0, 00:20:55.335 "state": "online", 00:20:55.335 "raid_level": "raid1", 00:20:55.335 "superblock": true, 00:20:55.335 "num_base_bdevs": 4, 00:20:55.335 "num_base_bdevs_discovered": 2, 00:20:55.335 "num_base_bdevs_operational": 2, 00:20:55.335 "base_bdevs_list": [ 00:20:55.335 { 00:20:55.335 "name": null, 00:20:55.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.335 "is_configured": false, 00:20:55.335 "data_offset": 2048, 00:20:55.335 "data_size": 63488 00:20:55.335 }, 00:20:55.335 { 00:20:55.335 "name": null, 00:20:55.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.335 "is_configured": false, 00:20:55.335 "data_offset": 2048, 00:20:55.335 "data_size": 63488 00:20:55.335 }, 00:20:55.335 { 00:20:55.335 "name": "BaseBdev3", 00:20:55.335 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:55.335 "is_configured": true, 00:20:55.335 "data_offset": 2048, 00:20:55.335 "data_size": 63488 00:20:55.335 }, 00:20:55.335 { 00:20:55.335 "name": "BaseBdev4", 00:20:55.335 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 
00:20:55.335 "is_configured": true, 00:20:55.335 "data_offset": 2048, 00:20:55.335 "data_size": 63488 00:20:55.335 } 00:20:55.335 ] 00:20:55.335 }' 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.335 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.902 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:55.902 "name": "raid_bdev1", 00:20:55.902 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:55.902 "strip_size_kb": 0, 00:20:55.902 "state": "online", 00:20:55.902 "raid_level": "raid1", 00:20:55.902 "superblock": true, 00:20:55.902 "num_base_bdevs": 4, 00:20:55.902 "num_base_bdevs_discovered": 2, 00:20:55.902 "num_base_bdevs_operational": 2, 00:20:55.902 "base_bdevs_list": [ 00:20:55.902 { 00:20:55.902 "name": null, 00:20:55.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.902 "is_configured": false, 00:20:55.902 "data_offset": 2048, 00:20:55.902 "data_size": 63488 00:20:55.902 }, 00:20:55.902 { 00:20:55.902 "name": null, 00:20:55.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.903 "is_configured": false, 00:20:55.903 "data_offset": 2048, 00:20:55.903 "data_size": 63488 00:20:55.903 }, 00:20:55.903 { 00:20:55.903 "name": "BaseBdev3", 00:20:55.903 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:55.903 "is_configured": true, 00:20:55.903 "data_offset": 2048, 00:20:55.903 "data_size": 63488 00:20:55.903 }, 00:20:55.903 { 00:20:55.903 "name": "BaseBdev4", 00:20:55.903 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:55.903 "is_configured": true, 00:20:55.903 "data_offset": 2048, 00:20:55.903 "data_size": 63488 00:20:55.903 } 00:20:55.903 ] 00:20:55.903 }' 00:20:55.903 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:55.903 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:55.903 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:56.161 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:56.161 18:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:56.161 18:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc 
-p BaseBdev1 00:20:56.420 [2024-07-24 18:57:41.235187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:56.420 [2024-07-24 18:57:41.235221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.420 [2024-07-24 18:57:41.235233] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e89360 00:20:56.420 [2024-07-24 18:57:41.235239] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.420 [2024-07-24 18:57:41.235498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.420 [2024-07-24 18:57:41.235509] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:56.420 [2024-07-24 18:57:41.235553] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:56.420 [2024-07-24 18:57:41.235560] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:56.420 [2024-07-24 18:57:41.235569] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:56.420 BaseBdev1 00:20:56.420 18:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.356 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.614 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.614 "name": "raid_bdev1", 00:20:57.614 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:57.614 "strip_size_kb": 0, 00:20:57.614 "state": "online", 00:20:57.614 "raid_level": "raid1", 00:20:57.614 "superblock": true, 00:20:57.614 "num_base_bdevs": 4, 00:20:57.614 "num_base_bdevs_discovered": 2, 00:20:57.614 "num_base_bdevs_operational": 2, 00:20:57.614 "base_bdevs_list": [ 00:20:57.614 { 00:20:57.614 "name": null, 00:20:57.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.614 "is_configured": false, 00:20:57.614 "data_offset": 2048, 00:20:57.614 "data_size": 63488 00:20:57.614 }, 00:20:57.614 { 00:20:57.614 "name": null, 00:20:57.614 
"uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.614 "is_configured": false, 00:20:57.614 "data_offset": 2048, 00:20:57.614 "data_size": 63488 00:20:57.614 }, 00:20:57.614 { 00:20:57.614 "name": "BaseBdev3", 00:20:57.614 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:57.614 "is_configured": true, 00:20:57.614 "data_offset": 2048, 00:20:57.614 "data_size": 63488 00:20:57.614 }, 00:20:57.614 { 00:20:57.614 "name": "BaseBdev4", 00:20:57.614 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:57.614 "is_configured": true, 00:20:57.614 "data_offset": 2048, 00:20:57.614 "data_size": 63488 00:20:57.614 } 00:20:57.614 ] 00:20:57.614 }' 00:20:57.614 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.614 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.872 18:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.130 "name": "raid_bdev1", 00:20:58.130 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:58.130 "strip_size_kb": 0, 00:20:58.130 "state": "online", 00:20:58.130 "raid_level": "raid1", 00:20:58.130 "superblock": true, 00:20:58.130 "num_base_bdevs": 4, 00:20:58.130 "num_base_bdevs_discovered": 2, 00:20:58.130 "num_base_bdevs_operational": 2, 00:20:58.130 "base_bdevs_list": [ 00:20:58.130 { 00:20:58.130 "name": null, 00:20:58.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.130 "is_configured": false, 00:20:58.130 "data_offset": 2048, 00:20:58.130 "data_size": 63488 00:20:58.130 }, 00:20:58.130 { 00:20:58.130 "name": null, 00:20:58.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.130 "is_configured": false, 00:20:58.130 "data_offset": 2048, 00:20:58.130 "data_size": 63488 00:20:58.130 }, 00:20:58.130 { 00:20:58.130 "name": "BaseBdev3", 00:20:58.130 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:58.130 "is_configured": true, 00:20:58.130 "data_offset": 2048, 00:20:58.130 "data_size": 63488 00:20:58.130 }, 00:20:58.130 { 00:20:58.130 "name": "BaseBdev4", 00:20:58.130 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:58.130 "is_configured": true, 00:20:58.130 "data_offset": 2048, 00:20:58.130 "data_size": 63488 00:20:58.130 } 00:20:58.130 ] 00:20:58.130 }' 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.130 18:57:43 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:58.130 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:58.387 [2024-07-24 18:57:43.264591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:58.387 [2024-07-24 18:57:43.264687] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:58.387 [2024-07-24 18:57:43.264696] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:58.387 request: 00:20:58.387 { 00:20:58.387 "base_bdev": "BaseBdev1", 00:20:58.387 "raid_bdev": "raid_bdev1", 00:20:58.387 "method": "bdev_raid_add_base_bdev", 00:20:58.387 "req_id": 1 00:20:58.387 } 00:20:58.387 Got JSON-RPC error response 00:20:58.387 response: 00:20:58.387 { 00:20:58.387 "code": -22, 00:20:58.387 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:58.387 } 00:20:58.387 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:20:58.387 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:58.387 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:58.387 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:58.387 18:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.321 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.579 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.579 "name": "raid_bdev1", 00:20:59.579 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:20:59.579 "strip_size_kb": 0, 00:20:59.579 "state": "online", 00:20:59.579 "raid_level": "raid1", 00:20:59.579 "superblock": true, 00:20:59.579 "num_base_bdevs": 4, 00:20:59.579 "num_base_bdevs_discovered": 2, 00:20:59.579 "num_base_bdevs_operational": 2, 00:20:59.579 "base_bdevs_list": [ 00:20:59.579 { 00:20:59.579 "name": null, 00:20:59.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.579 "is_configured": false, 00:20:59.579 "data_offset": 2048, 00:20:59.579 "data_size": 63488 00:20:59.579 }, 00:20:59.579 { 00:20:59.579 "name": null, 00:20:59.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.579 "is_configured": false, 00:20:59.579 "data_offset": 2048, 00:20:59.579 "data_size": 63488 00:20:59.579 }, 00:20:59.579 { 00:20:59.579 "name": "BaseBdev3", 00:20:59.579 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:20:59.579 "is_configured": true, 00:20:59.579 "data_offset": 2048, 00:20:59.579 "data_size": 63488 00:20:59.579 }, 00:20:59.579 { 00:20:59.579 "name": "BaseBdev4", 00:20:59.579 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:20:59.579 "is_configured": true, 00:20:59.579 "data_offset": 2048, 00:20:59.579 "data_size": 63488 00:20:59.579 } 00:20:59.579 ] 00:20:59.579 }' 00:20:59.579 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.579 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.145 18:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.145 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:00.145 "name": "raid_bdev1", 00:21:00.145 "uuid": "4c7971e5-1b72-42e5-9f20-3f2fa869a9bd", 00:21:00.145 "strip_size_kb": 0, 00:21:00.145 "state": "online", 00:21:00.145 "raid_level": "raid1", 00:21:00.146 "superblock": true, 00:21:00.146 "num_base_bdevs": 4, 00:21:00.146 "num_base_bdevs_discovered": 2, 00:21:00.146 "num_base_bdevs_operational": 2, 00:21:00.146 "base_bdevs_list": [ 00:21:00.146 { 00:21:00.146 "name": null, 00:21:00.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.146 "is_configured": false, 00:21:00.146 "data_offset": 2048, 00:21:00.146 "data_size": 63488 00:21:00.146 }, 00:21:00.146 { 00:21:00.146 "name": null, 00:21:00.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.146 "is_configured": false, 00:21:00.146 "data_offset": 2048, 00:21:00.146 "data_size": 63488 00:21:00.146 }, 00:21:00.146 { 00:21:00.146 "name": "BaseBdev3", 00:21:00.146 "uuid": "44aad5a7-e833-53f1-8084-8b43d1141aa2", 00:21:00.146 "is_configured": true, 00:21:00.146 "data_offset": 2048, 00:21:00.146 "data_size": 63488 00:21:00.146 }, 00:21:00.146 { 00:21:00.146 "name": "BaseBdev4", 00:21:00.146 "uuid": "1906cde2-28bf-5c70-b688-93b4521d3e09", 00:21:00.146 "is_configured": true, 00:21:00.146 "data_offset": 2048, 00:21:00.146 "data_size": 63488 00:21:00.146 } 00:21:00.146 ] 00:21:00.146 }' 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2177324 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2177324 ']' 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2177324 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:00.146 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2177324 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2177324' 00:21:00.405 killing process with pid 2177324 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2177324 00:21:00.405 Received shutdown signal, test time was about 22.553044 seconds 00:21:00.405 00:21:00.405 Latency(us) 00:21:00.405 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:00.405 =================================================================================================================== 00:21:00.405 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:00.405 [2024-07-24 18:57:45.167877] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:00.405 [2024-07-24 18:57:45.167952] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:00.405 [2024-07-24 18:57:45.167992] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:00.405 [2024-07-24 18:57:45.167998] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e801d0 name raid_bdev1, state offline 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2177324 00:21:00.405 [2024-07-24 18:57:45.203738] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:00.405 00:21:00.405 real 0m26.687s 00:21:00.405 user 0m41.180s 00:21:00.405 sys 0m3.267s 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:00.405 18:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:00.405 ************************************ 00:21:00.405 END TEST raid_rebuild_test_sb_io 00:21:00.405 ************************************ 00:21:00.405 18:57:45 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:21:00.405 18:57:45 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:21:00.405 18:57:45 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:21:00.405 18:57:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:00.405 18:57:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:00.405 18:57:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:00.664 ************************************ 00:21:00.664 START TEST raid_state_function_test_sb_4k 00:21:00.664 ************************************ 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:00.664 18:57:45 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2182121 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2182121' 00:21:00.664 Process raid pid: 2182121 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2182121 /var/tmp/spdk-raid.sock 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2182121 ']' 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:00.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:00.664 18:57:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:00.664 [2024-07-24 18:57:45.491893] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:21:00.664 [2024-07-24 18:57:45.491930] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:00.664 [2024-07-24 18:57:45.556538] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:00.664 [2024-07-24 18:57:45.635409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:00.924 [2024-07-24 18:57:45.690600] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:00.924 [2024-07-24 18:57:45.690624] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:01.489 [2024-07-24 18:57:46.429586] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:01.489 [2024-07-24 18:57:46.429616] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:01.489 [2024-07-24 18:57:46.429621] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:01.489 [2024-07-24 18:57:46.429627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.489 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.747 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.747 "name": "Existed_Raid", 00:21:01.747 "uuid": "17bf27bb-1587-416e-bd65-81f2bc553352", 00:21:01.747 "strip_size_kb": 0, 00:21:01.747 "state": 
"configuring", 00:21:01.747 "raid_level": "raid1", 00:21:01.747 "superblock": true, 00:21:01.747 "num_base_bdevs": 2, 00:21:01.747 "num_base_bdevs_discovered": 0, 00:21:01.747 "num_base_bdevs_operational": 2, 00:21:01.747 "base_bdevs_list": [ 00:21:01.747 { 00:21:01.747 "name": "BaseBdev1", 00:21:01.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.747 "is_configured": false, 00:21:01.747 "data_offset": 0, 00:21:01.747 "data_size": 0 00:21:01.747 }, 00:21:01.747 { 00:21:01.747 "name": "BaseBdev2", 00:21:01.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.747 "is_configured": false, 00:21:01.747 "data_offset": 0, 00:21:01.747 "data_size": 0 00:21:01.747 } 00:21:01.747 ] 00:21:01.747 }' 00:21:01.747 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.747 18:57:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:02.313 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:02.313 [2024-07-24 18:57:47.231598] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:02.313 [2024-07-24 18:57:47.231617] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2471b80 name Existed_Raid, state configuring 00:21:02.313 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:02.571 [2024-07-24 18:57:47.400043] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:02.571 [2024-07-24 18:57:47.400059] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:02.571 [2024-07-24 18:57:47.400063] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:02.571 [2024-07-24 18:57:47.400068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:02.571 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:21:02.571 [2024-07-24 18:57:47.580750] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:02.829 BaseBdev1 00:21:02.829 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:02.829 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:02.829 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:02.829 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:02.829 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:02.830 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:02.830 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:02.830 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:03.089 [ 00:21:03.089 { 00:21:03.089 "name": "BaseBdev1", 00:21:03.089 "aliases": [ 00:21:03.089 "42e8e922-189e-48c4-9693-2128ea042c48" 00:21:03.089 ], 00:21:03.089 "product_name": "Malloc disk", 00:21:03.089 "block_size": 4096, 00:21:03.089 "num_blocks": 8192, 00:21:03.089 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:03.089 "assigned_rate_limits": { 00:21:03.089 "rw_ios_per_sec": 0, 00:21:03.089 "rw_mbytes_per_sec": 0, 00:21:03.089 "r_mbytes_per_sec": 0, 00:21:03.089 "w_mbytes_per_sec": 0 00:21:03.089 }, 00:21:03.089 "claimed": true, 00:21:03.089 "claim_type": "exclusive_write", 00:21:03.089 "zoned": false, 00:21:03.089 "supported_io_types": { 00:21:03.089 "read": true, 00:21:03.089 "write": true, 00:21:03.089 "unmap": true, 00:21:03.089 "flush": true, 00:21:03.089 "reset": true, 00:21:03.089 "nvme_admin": false, 00:21:03.089 "nvme_io": false, 00:21:03.089 "nvme_io_md": false, 00:21:03.089 "write_zeroes": true, 00:21:03.089 "zcopy": true, 00:21:03.089 "get_zone_info": false, 00:21:03.089 "zone_management": false, 00:21:03.089 "zone_append": false, 00:21:03.089 "compare": false, 00:21:03.089 "compare_and_write": false, 00:21:03.089 "abort": true, 00:21:03.089 "seek_hole": false, 00:21:03.089 "seek_data": false, 00:21:03.089 "copy": true, 00:21:03.089 "nvme_iov_md": false 00:21:03.089 }, 00:21:03.089 "memory_domains": [ 00:21:03.089 { 00:21:03.089 "dma_device_id": "system", 00:21:03.089 "dma_device_type": 1 00:21:03.089 }, 00:21:03.089 { 00:21:03.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.089 "dma_device_type": 2 00:21:03.089 } 00:21:03.089 ], 00:21:03.089 "driver_specific": {} 00:21:03.089 } 00:21:03.089 ] 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.089 18:57:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.348 18:57:48 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.348 "name": "Existed_Raid", 00:21:03.348 "uuid": "e5212cc4-92d1-4247-99cc-19df0c967eec", 00:21:03.348 "strip_size_kb": 0, 00:21:03.348 "state": "configuring", 00:21:03.348 "raid_level": "raid1", 00:21:03.348 "superblock": true, 00:21:03.348 "num_base_bdevs": 2, 00:21:03.348 "num_base_bdevs_discovered": 1, 00:21:03.348 "num_base_bdevs_operational": 2, 00:21:03.348 "base_bdevs_list": [ 00:21:03.348 { 00:21:03.348 "name": "BaseBdev1", 00:21:03.348 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:03.348 "is_configured": true, 00:21:03.348 "data_offset": 256, 00:21:03.348 "data_size": 7936 00:21:03.348 }, 00:21:03.348 { 00:21:03.348 "name": "BaseBdev2", 00:21:03.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.348 "is_configured": false, 00:21:03.348 "data_offset": 0, 00:21:03.348 "data_size": 0 00:21:03.348 } 00:21:03.348 ] 00:21:03.348 }' 00:21:03.348 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.348 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:03.606 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:03.867 [2024-07-24 18:57:48.747796] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:03.867 [2024-07-24 18:57:48.747824] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2471470 name Existed_Raid, state configuring 00:21:03.867 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:04.125 [2024-07-24 18:57:48.916243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:04.125 [2024-07-24 18:57:48.917298] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:04.125 [2024-07-24 18:57:48.917322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.125 18:57:48 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.125 18:57:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.125 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.125 "name": "Existed_Raid", 00:21:04.125 "uuid": "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f", 00:21:04.125 "strip_size_kb": 0, 00:21:04.125 "state": "configuring", 00:21:04.125 "raid_level": "raid1", 00:21:04.125 "superblock": true, 00:21:04.125 "num_base_bdevs": 2, 00:21:04.125 "num_base_bdevs_discovered": 1, 00:21:04.125 "num_base_bdevs_operational": 2, 00:21:04.125 "base_bdevs_list": [ 00:21:04.125 { 00:21:04.125 "name": "BaseBdev1", 00:21:04.125 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:04.125 "is_configured": true, 00:21:04.125 "data_offset": 256, 00:21:04.125 "data_size": 7936 00:21:04.125 }, 00:21:04.125 { 00:21:04.125 "name": "BaseBdev2", 00:21:04.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.125 "is_configured": false, 00:21:04.125 "data_offset": 0, 00:21:04.125 "data_size": 0 00:21:04.125 } 00:21:04.125 ] 00:21:04.125 }' 00:21:04.125 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.125 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:04.703 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:21:04.962 [2024-07-24 18:57:49.757049] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:04.962 [2024-07-24 18:57:49.757160] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2472260 00:21:04.962 [2024-07-24 18:57:49.757169] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:04.962 [2024-07-24 18:57:49.757283] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24713c0 00:21:04.962 [2024-07-24 18:57:49.757363] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2472260 00:21:04.962 [2024-07-24 18:57:49.757368] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2472260 00:21:04.962 [2024-07-24 18:57:49.757427] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:04.962 BaseBdev2 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.962 18:57:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:05.220 [ 00:21:05.220 { 00:21:05.220 "name": "BaseBdev2", 00:21:05.220 "aliases": [ 00:21:05.220 "28a6198a-58de-40eb-93ec-2fa29d79bfac" 00:21:05.220 ], 00:21:05.220 "product_name": "Malloc disk", 00:21:05.220 "block_size": 4096, 00:21:05.220 "num_blocks": 8192, 00:21:05.220 "uuid": "28a6198a-58de-40eb-93ec-2fa29d79bfac", 00:21:05.220 "assigned_rate_limits": { 00:21:05.220 "rw_ios_per_sec": 0, 00:21:05.220 "rw_mbytes_per_sec": 0, 00:21:05.220 "r_mbytes_per_sec": 0, 00:21:05.220 "w_mbytes_per_sec": 0 00:21:05.220 }, 00:21:05.220 "claimed": true, 00:21:05.220 "claim_type": "exclusive_write", 00:21:05.220 "zoned": false, 00:21:05.220 "supported_io_types": { 00:21:05.220 "read": true, 00:21:05.220 "write": true, 00:21:05.220 "unmap": true, 00:21:05.220 "flush": true, 00:21:05.220 "reset": true, 00:21:05.220 "nvme_admin": false, 00:21:05.220 "nvme_io": false, 00:21:05.221 "nvme_io_md": false, 00:21:05.221 "write_zeroes": true, 00:21:05.221 "zcopy": true, 00:21:05.221 "get_zone_info": false, 00:21:05.221 "zone_management": false, 00:21:05.221 "zone_append": false, 00:21:05.221 "compare": false, 00:21:05.221 "compare_and_write": false, 00:21:05.221 "abort": true, 00:21:05.221 "seek_hole": false, 00:21:05.221 "seek_data": false, 00:21:05.221 "copy": true, 00:21:05.221 "nvme_iov_md": false 00:21:05.221 }, 00:21:05.221 "memory_domains": [ 00:21:05.221 { 00:21:05.221 "dma_device_id": "system", 00:21:05.221 "dma_device_type": 1 00:21:05.221 }, 00:21:05.221 { 00:21:05.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.221 "dma_device_type": 2 00:21:05.221 } 00:21:05.221 ], 00:21:05.221 "driver_specific": {} 00:21:05.221 } 00:21:05.221 ] 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:05.221 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.479 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.479 "name": "Existed_Raid", 00:21:05.479 "uuid": "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f", 00:21:05.479 "strip_size_kb": 0, 00:21:05.479 "state": "online", 00:21:05.479 "raid_level": "raid1", 00:21:05.479 "superblock": true, 00:21:05.479 "num_base_bdevs": 2, 00:21:05.479 "num_base_bdevs_discovered": 2, 00:21:05.479 "num_base_bdevs_operational": 2, 00:21:05.479 "base_bdevs_list": [ 00:21:05.479 { 00:21:05.479 "name": "BaseBdev1", 00:21:05.479 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:05.479 "is_configured": true, 00:21:05.479 "data_offset": 256, 00:21:05.479 "data_size": 7936 00:21:05.479 }, 00:21:05.479 { 00:21:05.479 "name": "BaseBdev2", 00:21:05.479 "uuid": "28a6198a-58de-40eb-93ec-2fa29d79bfac", 00:21:05.479 "is_configured": true, 00:21:05.479 "data_offset": 256, 00:21:05.479 "data_size": 7936 00:21:05.479 } 00:21:05.479 ] 00:21:05.479 }' 00:21:05.479 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.479 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:05.739 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:06.009 [2024-07-24 18:57:50.884157] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:06.009 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:06.009 "name": "Existed_Raid", 00:21:06.009 "aliases": [ 00:21:06.009 "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f" 00:21:06.009 ], 00:21:06.009 "product_name": "Raid Volume", 00:21:06.009 "block_size": 4096, 00:21:06.009 "num_blocks": 7936, 00:21:06.009 "uuid": "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f", 00:21:06.009 "assigned_rate_limits": { 00:21:06.009 "rw_ios_per_sec": 0, 00:21:06.009 "rw_mbytes_per_sec": 0, 00:21:06.009 "r_mbytes_per_sec": 0, 00:21:06.009 "w_mbytes_per_sec": 0 00:21:06.009 }, 00:21:06.009 "claimed": false, 00:21:06.009 "zoned": false, 00:21:06.009 "supported_io_types": { 00:21:06.009 "read": 
true, 00:21:06.009 "write": true, 00:21:06.009 "unmap": false, 00:21:06.009 "flush": false, 00:21:06.009 "reset": true, 00:21:06.009 "nvme_admin": false, 00:21:06.009 "nvme_io": false, 00:21:06.009 "nvme_io_md": false, 00:21:06.009 "write_zeroes": true, 00:21:06.009 "zcopy": false, 00:21:06.009 "get_zone_info": false, 00:21:06.009 "zone_management": false, 00:21:06.009 "zone_append": false, 00:21:06.009 "compare": false, 00:21:06.009 "compare_and_write": false, 00:21:06.009 "abort": false, 00:21:06.009 "seek_hole": false, 00:21:06.009 "seek_data": false, 00:21:06.009 "copy": false, 00:21:06.009 "nvme_iov_md": false 00:21:06.009 }, 00:21:06.009 "memory_domains": [ 00:21:06.009 { 00:21:06.009 "dma_device_id": "system", 00:21:06.009 "dma_device_type": 1 00:21:06.009 }, 00:21:06.009 { 00:21:06.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.009 "dma_device_type": 2 00:21:06.009 }, 00:21:06.009 { 00:21:06.009 "dma_device_id": "system", 00:21:06.009 "dma_device_type": 1 00:21:06.009 }, 00:21:06.009 { 00:21:06.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.009 "dma_device_type": 2 00:21:06.009 } 00:21:06.009 ], 00:21:06.009 "driver_specific": { 00:21:06.009 "raid": { 00:21:06.009 "uuid": "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f", 00:21:06.009 "strip_size_kb": 0, 00:21:06.009 "state": "online", 00:21:06.009 "raid_level": "raid1", 00:21:06.009 "superblock": true, 00:21:06.009 "num_base_bdevs": 2, 00:21:06.009 "num_base_bdevs_discovered": 2, 00:21:06.009 "num_base_bdevs_operational": 2, 00:21:06.009 "base_bdevs_list": [ 00:21:06.009 { 00:21:06.009 "name": "BaseBdev1", 00:21:06.009 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:06.009 "is_configured": true, 00:21:06.009 "data_offset": 256, 00:21:06.009 "data_size": 7936 00:21:06.009 }, 00:21:06.009 { 00:21:06.009 "name": "BaseBdev2", 00:21:06.009 "uuid": "28a6198a-58de-40eb-93ec-2fa29d79bfac", 00:21:06.009 "is_configured": true, 00:21:06.009 "data_offset": 256, 00:21:06.009 "data_size": 7936 00:21:06.009 } 00:21:06.009 ] 00:21:06.009 } 00:21:06.009 } 00:21:06.009 }' 00:21:06.010 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:06.010 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:06.010 BaseBdev2' 00:21:06.010 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.010 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.010 18:57:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.268 "name": "BaseBdev1", 00:21:06.268 "aliases": [ 00:21:06.268 "42e8e922-189e-48c4-9693-2128ea042c48" 00:21:06.268 ], 00:21:06.268 "product_name": "Malloc disk", 00:21:06.268 "block_size": 4096, 00:21:06.268 "num_blocks": 8192, 00:21:06.268 "uuid": "42e8e922-189e-48c4-9693-2128ea042c48", 00:21:06.268 "assigned_rate_limits": { 00:21:06.268 "rw_ios_per_sec": 0, 00:21:06.268 "rw_mbytes_per_sec": 0, 00:21:06.268 "r_mbytes_per_sec": 0, 00:21:06.268 "w_mbytes_per_sec": 0 00:21:06.268 }, 00:21:06.268 "claimed": true, 00:21:06.268 "claim_type": "exclusive_write", 00:21:06.268 "zoned": false, 00:21:06.268 
"supported_io_types": { 00:21:06.268 "read": true, 00:21:06.268 "write": true, 00:21:06.268 "unmap": true, 00:21:06.268 "flush": true, 00:21:06.268 "reset": true, 00:21:06.268 "nvme_admin": false, 00:21:06.268 "nvme_io": false, 00:21:06.268 "nvme_io_md": false, 00:21:06.268 "write_zeroes": true, 00:21:06.268 "zcopy": true, 00:21:06.268 "get_zone_info": false, 00:21:06.268 "zone_management": false, 00:21:06.268 "zone_append": false, 00:21:06.268 "compare": false, 00:21:06.268 "compare_and_write": false, 00:21:06.268 "abort": true, 00:21:06.268 "seek_hole": false, 00:21:06.268 "seek_data": false, 00:21:06.268 "copy": true, 00:21:06.268 "nvme_iov_md": false 00:21:06.268 }, 00:21:06.268 "memory_domains": [ 00:21:06.268 { 00:21:06.268 "dma_device_id": "system", 00:21:06.268 "dma_device_type": 1 00:21:06.268 }, 00:21:06.268 { 00:21:06.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.268 "dma_device_type": 2 00:21:06.268 } 00:21:06.268 ], 00:21:06.268 "driver_specific": {} 00:21:06.268 }' 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.268 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:06.526 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.784 "name": "BaseBdev2", 00:21:06.784 "aliases": [ 00:21:06.784 "28a6198a-58de-40eb-93ec-2fa29d79bfac" 00:21:06.784 ], 00:21:06.784 "product_name": "Malloc disk", 00:21:06.784 "block_size": 4096, 00:21:06.784 "num_blocks": 8192, 00:21:06.784 "uuid": "28a6198a-58de-40eb-93ec-2fa29d79bfac", 00:21:06.784 "assigned_rate_limits": { 00:21:06.784 "rw_ios_per_sec": 0, 00:21:06.784 "rw_mbytes_per_sec": 0, 00:21:06.784 "r_mbytes_per_sec": 0, 00:21:06.784 "w_mbytes_per_sec": 0 00:21:06.784 }, 00:21:06.784 "claimed": true, 00:21:06.784 "claim_type": "exclusive_write", 00:21:06.784 "zoned": false, 00:21:06.784 "supported_io_types": { 00:21:06.784 "read": true, 00:21:06.784 "write": true, 
00:21:06.784 "unmap": true, 00:21:06.784 "flush": true, 00:21:06.784 "reset": true, 00:21:06.784 "nvme_admin": false, 00:21:06.784 "nvme_io": false, 00:21:06.784 "nvme_io_md": false, 00:21:06.784 "write_zeroes": true, 00:21:06.784 "zcopy": true, 00:21:06.784 "get_zone_info": false, 00:21:06.784 "zone_management": false, 00:21:06.784 "zone_append": false, 00:21:06.784 "compare": false, 00:21:06.784 "compare_and_write": false, 00:21:06.784 "abort": true, 00:21:06.784 "seek_hole": false, 00:21:06.784 "seek_data": false, 00:21:06.784 "copy": true, 00:21:06.784 "nvme_iov_md": false 00:21:06.784 }, 00:21:06.784 "memory_domains": [ 00:21:06.784 { 00:21:06.784 "dma_device_id": "system", 00:21:06.784 "dma_device_type": 1 00:21:06.784 }, 00:21:06.784 { 00:21:06.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.784 "dma_device_type": 2 00:21:06.784 } 00:21:06.784 ], 00:21:06.784 "driver_specific": {} 00:21:06.784 }' 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.784 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:07.041 [2024-07-24 18:57:51.974820] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:07.041 18:57:51 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.041 18:57:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:07.299 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.299 "name": "Existed_Raid", 00:21:07.299 "uuid": "f2fb2454-6d3a-4f7d-ab2b-47c657c9b59f", 00:21:07.299 "strip_size_kb": 0, 00:21:07.299 "state": "online", 00:21:07.299 "raid_level": "raid1", 00:21:07.299 "superblock": true, 00:21:07.299 "num_base_bdevs": 2, 00:21:07.299 "num_base_bdevs_discovered": 1, 00:21:07.299 "num_base_bdevs_operational": 1, 00:21:07.299 "base_bdevs_list": [ 00:21:07.299 { 00:21:07.299 "name": null, 00:21:07.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.299 "is_configured": false, 00:21:07.299 "data_offset": 256, 00:21:07.299 "data_size": 7936 00:21:07.299 }, 00:21:07.299 { 00:21:07.299 "name": "BaseBdev2", 00:21:07.299 "uuid": "28a6198a-58de-40eb-93ec-2fa29d79bfac", 00:21:07.299 "is_configured": true, 00:21:07.299 "data_offset": 256, 00:21:07.299 "data_size": 7936 00:21:07.299 } 00:21:07.299 ] 00:21:07.299 }' 00:21:07.299 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.299 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:07.865 18:57:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:08.124 [2024-07-24 18:57:52.982237] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:08.124 [2024-07-24 18:57:52.982303] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state 
changing from online to offline 00:21:08.124 [2024-07-24 18:57:52.992297] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:08.124 [2024-07-24 18:57:52.992322] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:08.124 [2024-07-24 18:57:52.992333] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2472260 name Existed_Raid, state offline 00:21:08.124 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:08.124 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:08.124 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.124 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2182121 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2182121 ']' 00:21:08.399 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2182121 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2182121 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2182121' 00:21:08.400 killing process with pid 2182121 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2182121 00:21:08.400 [2024-07-24 18:57:53.211441] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2182121 00:21:08.400 [2024-07-24 18:57:53.212218] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:21:08.400 00:21:08.400 real 0m7.948s 00:21:08.400 user 0m14.204s 00:21:08.400 sys 0m1.254s 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:08.400 18:57:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:08.400 ************************************ 00:21:08.400 END TEST raid_state_function_test_sb_4k 00:21:08.400 ************************************ 00:21:08.701 18:57:53 bdev_raid -- bdev/bdev_raid.sh@899 -- # 
run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:21:08.701 18:57:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:08.701 18:57:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:08.701 18:57:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:08.701 ************************************ 00:21:08.701 START TEST raid_superblock_test_4k 00:21:08.701 ************************************ 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=2183677 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2183677 /var/tmp/spdk-raid.sock 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2183677 ']' 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:08.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
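The raid_superblock_test_4k bring-up traced above follows the standard bdev_raid pattern: a bdev_svc application is started on its own RPC socket with the bdev_raid debug log flag, and the harness blocks in waitforlisten until that socket answers. A minimal stand-alone sketch of the same bring-up (the polling loop is only a stand-in for the waitforlisten helper, and the paths assume the workspace layout shown in this log):

# launch the RPC target used by the raid tests, with bdev_raid debug logging
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
    -r /var/tmp/spdk-raid.sock -L bdev_raid &
raid_pid=$!

# wait until the UNIX-domain socket accepts RPCs (stand-in for waitforlisten)
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
until $rpc -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done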
00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:08.701 18:57:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:08.701 [2024-07-24 18:57:53.504513] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:21:08.701 [2024-07-24 18:57:53.504556] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2183677 ] 00:21:08.701 [2024-07-24 18:57:53.568389] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:08.701 [2024-07-24 18:57:53.645304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:08.701 [2024-07-24 18:57:53.698010] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:08.701 [2024-07-24 18:57:53.698032] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:21:09.639 malloc1 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:09.639 [2024-07-24 18:57:54.614330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:09.639 [2024-07-24 18:57:54.614368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.639 [2024-07-24 18:57:54.614379] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1298e20 00:21:09.639 [2024-07-24 18:57:54.614386] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.639 [2024-07-24 18:57:54.615587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.639 [2024-07-24 18:57:54.615611] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:09.639 pt1 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:09.639 18:57:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:09.639 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:21:09.899 malloc2 00:21:09.899 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:10.158 [2024-07-24 18:57:54.974743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:10.158 [2024-07-24 18:57:54.974774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:10.158 [2024-07-24 18:57:54.974782] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1442ed0 00:21:10.158 [2024-07-24 18:57:54.974788] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:10.158 [2024-07-24 18:57:54.975782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:10.158 [2024-07-24 18:57:54.975802] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:10.158 pt2 00:21:10.158 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:10.158 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:10.158 18:57:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:10.158 [2024-07-24 18:57:55.147199] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:10.158 [2024-07-24 18:57:55.147980] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:10.158 [2024-07-24 18:57:55.148077] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1442170 00:21:10.158 [2024-07-24 18:57:55.148086] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:10.158 [2024-07-24 18:57:55.148204] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14445d0 00:21:10.158 [2024-07-24 18:57:55.148297] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1442170 00:21:10.158 [2024-07-24 18:57:55.148302] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1442170 00:21:10.158 [2024-07-24 18:57:55.148361] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.416 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.416 "name": "raid_bdev1", 00:21:10.416 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:10.416 "strip_size_kb": 0, 00:21:10.416 "state": "online", 00:21:10.416 "raid_level": "raid1", 00:21:10.416 "superblock": true, 00:21:10.416 "num_base_bdevs": 2, 00:21:10.416 "num_base_bdevs_discovered": 2, 00:21:10.417 "num_base_bdevs_operational": 2, 00:21:10.417 "base_bdevs_list": [ 00:21:10.417 { 00:21:10.417 "name": "pt1", 00:21:10.417 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:10.417 "is_configured": true, 00:21:10.417 "data_offset": 256, 00:21:10.417 "data_size": 7936 00:21:10.417 }, 00:21:10.417 { 00:21:10.417 "name": "pt2", 00:21:10.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:10.417 "is_configured": true, 00:21:10.417 "data_offset": 256, 00:21:10.417 "data_size": 7936 00:21:10.417 } 00:21:10.417 ] 00:21:10.417 }' 00:21:10.417 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.417 18:57:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:10.983 18:57:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:11.242 [2024-07-24 18:57:55.997572] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:11.242 "name": "raid_bdev1", 00:21:11.242 "aliases": [ 00:21:11.242 "fd226c41-c06d-4534-b409-811d93f388a9" 00:21:11.242 ], 00:21:11.242 "product_name": "Raid Volume", 00:21:11.242 "block_size": 4096, 00:21:11.242 "num_blocks": 7936, 00:21:11.242 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:11.242 "assigned_rate_limits": { 00:21:11.242 "rw_ios_per_sec": 0, 00:21:11.242 "rw_mbytes_per_sec": 0, 00:21:11.242 "r_mbytes_per_sec": 0, 00:21:11.242 "w_mbytes_per_sec": 0 00:21:11.242 }, 00:21:11.242 "claimed": false, 00:21:11.242 "zoned": false, 00:21:11.242 "supported_io_types": { 00:21:11.242 "read": true, 00:21:11.242 "write": true, 00:21:11.242 "unmap": false, 00:21:11.242 "flush": false, 00:21:11.242 "reset": true, 00:21:11.242 "nvme_admin": false, 00:21:11.242 "nvme_io": false, 00:21:11.242 "nvme_io_md": false, 00:21:11.242 "write_zeroes": true, 00:21:11.242 "zcopy": false, 00:21:11.242 "get_zone_info": false, 00:21:11.242 "zone_management": false, 00:21:11.242 "zone_append": false, 00:21:11.242 "compare": false, 00:21:11.242 "compare_and_write": false, 00:21:11.242 "abort": false, 00:21:11.242 "seek_hole": false, 00:21:11.242 "seek_data": false, 00:21:11.242 "copy": false, 00:21:11.242 "nvme_iov_md": false 00:21:11.242 }, 00:21:11.242 "memory_domains": [ 00:21:11.242 { 00:21:11.242 "dma_device_id": "system", 00:21:11.242 "dma_device_type": 1 00:21:11.242 }, 00:21:11.242 { 00:21:11.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.242 "dma_device_type": 2 00:21:11.242 }, 00:21:11.242 { 00:21:11.242 "dma_device_id": "system", 00:21:11.242 "dma_device_type": 1 00:21:11.242 }, 00:21:11.242 { 00:21:11.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.242 "dma_device_type": 2 00:21:11.242 } 00:21:11.242 ], 00:21:11.242 "driver_specific": { 00:21:11.242 "raid": { 00:21:11.242 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:11.242 "strip_size_kb": 0, 00:21:11.242 "state": "online", 00:21:11.242 "raid_level": "raid1", 00:21:11.242 "superblock": true, 00:21:11.242 "num_base_bdevs": 2, 00:21:11.242 "num_base_bdevs_discovered": 2, 00:21:11.242 "num_base_bdevs_operational": 2, 00:21:11.242 "base_bdevs_list": [ 00:21:11.242 { 00:21:11.242 "name": "pt1", 00:21:11.242 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:11.242 "is_configured": true, 00:21:11.242 "data_offset": 256, 00:21:11.242 "data_size": 7936 00:21:11.242 }, 00:21:11.242 { 00:21:11.242 "name": "pt2", 00:21:11.242 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:11.242 "is_configured": true, 00:21:11.242 "data_offset": 256, 00:21:11.242 "data_size": 7936 00:21:11.242 } 00:21:11.242 ] 00:21:11.242 } 00:21:11.242 } 00:21:11.242 }' 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:11.242 pt2' 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
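The verify_raid_bdev_properties step traced here reduces to bdev_get_bdevs plus jq filters: dump raid_bdev1, collect its configured base bdevs, then confirm each one is a 4096-byte-block device with no metadata or DIF. A condensed sketch of the same checks, assuming the raid_bdev1/pt1/pt2 names used in this run and the harness's set -e behaviour:

set -e
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# configured base bdevs of the raid volume (expected here: pt1 pt2)
names=$($rpc bdev_get_bdevs -b raid_bdev1 |
        jq -r '.[0].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

for name in $names; do
    info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
    [ "$(jq .block_size <<< "$info")" = 4096 ]      # 4 KiB data blocks
    [ "$(jq .md_size <<< "$info")" = null ]         # no separate metadata
    [ "$(jq .md_interleave <<< "$info")" = null ]   # no interleaved metadata
    [ "$(jq .dif_type <<< "$info")" = null ]        # no DIF protection
done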
00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:11.242 "name": "pt1", 00:21:11.242 "aliases": [ 00:21:11.242 "00000000-0000-0000-0000-000000000001" 00:21:11.242 ], 00:21:11.242 "product_name": "passthru", 00:21:11.242 "block_size": 4096, 00:21:11.242 "num_blocks": 8192, 00:21:11.242 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:11.242 "assigned_rate_limits": { 00:21:11.242 "rw_ios_per_sec": 0, 00:21:11.242 "rw_mbytes_per_sec": 0, 00:21:11.242 "r_mbytes_per_sec": 0, 00:21:11.242 "w_mbytes_per_sec": 0 00:21:11.242 }, 00:21:11.242 "claimed": true, 00:21:11.242 "claim_type": "exclusive_write", 00:21:11.242 "zoned": false, 00:21:11.242 "supported_io_types": { 00:21:11.242 "read": true, 00:21:11.242 "write": true, 00:21:11.242 "unmap": true, 00:21:11.242 "flush": true, 00:21:11.242 "reset": true, 00:21:11.242 "nvme_admin": false, 00:21:11.242 "nvme_io": false, 00:21:11.242 "nvme_io_md": false, 00:21:11.242 "write_zeroes": true, 00:21:11.242 "zcopy": true, 00:21:11.242 "get_zone_info": false, 00:21:11.242 "zone_management": false, 00:21:11.242 "zone_append": false, 00:21:11.242 "compare": false, 00:21:11.242 "compare_and_write": false, 00:21:11.242 "abort": true, 00:21:11.242 "seek_hole": false, 00:21:11.242 "seek_data": false, 00:21:11.242 "copy": true, 00:21:11.242 "nvme_iov_md": false 00:21:11.242 }, 00:21:11.242 "memory_domains": [ 00:21:11.242 { 00:21:11.242 "dma_device_id": "system", 00:21:11.242 "dma_device_type": 1 00:21:11.242 }, 00:21:11.242 { 00:21:11.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.242 "dma_device_type": 2 00:21:11.242 } 00:21:11.242 ], 00:21:11.242 "driver_specific": { 00:21:11.242 "passthru": { 00:21:11.242 "name": "pt1", 00:21:11.242 "base_bdev_name": "malloc1" 00:21:11.242 } 00:21:11.242 } 00:21:11.242 }' 00:21:11.242 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:11.500 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:21:11.759 "name": "pt2", 00:21:11.759 "aliases": [ 00:21:11.759 "00000000-0000-0000-0000-000000000002" 00:21:11.759 ], 00:21:11.759 "product_name": "passthru", 00:21:11.759 "block_size": 4096, 00:21:11.759 "num_blocks": 8192, 00:21:11.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:11.759 "assigned_rate_limits": { 00:21:11.759 "rw_ios_per_sec": 0, 00:21:11.759 "rw_mbytes_per_sec": 0, 00:21:11.759 "r_mbytes_per_sec": 0, 00:21:11.759 "w_mbytes_per_sec": 0 00:21:11.759 }, 00:21:11.759 "claimed": true, 00:21:11.759 "claim_type": "exclusive_write", 00:21:11.759 "zoned": false, 00:21:11.759 "supported_io_types": { 00:21:11.759 "read": true, 00:21:11.759 "write": true, 00:21:11.759 "unmap": true, 00:21:11.759 "flush": true, 00:21:11.759 "reset": true, 00:21:11.759 "nvme_admin": false, 00:21:11.759 "nvme_io": false, 00:21:11.759 "nvme_io_md": false, 00:21:11.759 "write_zeroes": true, 00:21:11.759 "zcopy": true, 00:21:11.759 "get_zone_info": false, 00:21:11.759 "zone_management": false, 00:21:11.759 "zone_append": false, 00:21:11.759 "compare": false, 00:21:11.759 "compare_and_write": false, 00:21:11.759 "abort": true, 00:21:11.759 "seek_hole": false, 00:21:11.759 "seek_data": false, 00:21:11.759 "copy": true, 00:21:11.759 "nvme_iov_md": false 00:21:11.759 }, 00:21:11.759 "memory_domains": [ 00:21:11.759 { 00:21:11.759 "dma_device_id": "system", 00:21:11.759 "dma_device_type": 1 00:21:11.759 }, 00:21:11.759 { 00:21:11.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.759 "dma_device_type": 2 00:21:11.759 } 00:21:11.759 ], 00:21:11.759 "driver_specific": { 00:21:11.759 "passthru": { 00:21:11.759 "name": "pt2", 00:21:11.759 "base_bdev_name": "malloc2" 00:21:11.759 } 00:21:11.759 } 00:21:11.759 }' 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.759 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.019 18:57:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:12.278 [2024-07-24 18:57:57.184611] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fd226c41-c06d-4534-b409-811d93f388a9 
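Reading the raid bdev UUID back, as in the trace just above, is one more bdev_get_bdevs/jq round trip; the non-empty assertion that follows in the trace can be sketched as:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
raid_bdev_uuid=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
# the test aborts here if the configured raid bdev reports no UUID
[ -n "$raid_bdev_uuid" ]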
00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z fd226c41-c06d-4534-b409-811d93f388a9 ']' 00:21:12.278 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:12.536 [2024-07-24 18:57:57.352880] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:12.536 [2024-07-24 18:57:57.352893] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:12.536 [2024-07-24 18:57:57.352931] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:12.536 [2024-07-24 18:57:57.352969] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:12.536 [2024-07-24 18:57:57.352976] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1442170 name raid_bdev1, state offline 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:12.536 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:12.794 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:12.794 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:13.053 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:13.053 18:57:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:13.312 [2024-07-24 18:57:58.235155] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:13.312 [2024-07-24 18:57:58.236148] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:13.312 [2024-07-24 18:57:58.236190] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:13.312 [2024-07-24 18:57:58.236219] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:13.312 [2024-07-24 18:57:58.236229] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:13.312 [2024-07-24 18:57:58.236250] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1443900 name raid_bdev1, state configuring 00:21:13.312 request: 00:21:13.312 { 00:21:13.312 "name": "raid_bdev1", 00:21:13.312 "raid_level": "raid1", 00:21:13.312 "base_bdevs": [ 00:21:13.312 "malloc1", 00:21:13.312 "malloc2" 00:21:13.312 ], 00:21:13.312 "superblock": false, 00:21:13.312 "method": "bdev_raid_create", 00:21:13.312 "req_id": 1 00:21:13.312 } 00:21:13.312 Got JSON-RPC error response 00:21:13.312 response: 00:21:13.312 { 00:21:13.312 "code": -17, 00:21:13.312 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:13.312 } 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:13.312 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.570 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:13.570 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:13.570 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:13.570 [2024-07-24 18:57:58.580014] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:13.829 [2024-07-24 18:57:58.580051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.829 [2024-07-24 18:57:58.580080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1297760 00:21:13.829 [2024-07-24 18:57:58.580088] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.829 [2024-07-24 18:57:58.581268] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.829 [2024-07-24 18:57:58.581290] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:13.829 [2024-07-24 18:57:58.581336] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:13.829 [2024-07-24 18:57:58.581355] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:13.829 pt1 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.829 18:57:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.829 "name": "raid_bdev1", 00:21:13.830 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:13.830 "strip_size_kb": 0, 00:21:13.830 "state": "configuring", 00:21:13.830 "raid_level": "raid1", 00:21:13.830 "superblock": true, 00:21:13.830 "num_base_bdevs": 2, 00:21:13.830 "num_base_bdevs_discovered": 1, 00:21:13.830 "num_base_bdevs_operational": 2, 00:21:13.830 "base_bdevs_list": [ 00:21:13.830 { 00:21:13.830 "name": "pt1", 00:21:13.830 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:13.830 "is_configured": true, 00:21:13.830 "data_offset": 256, 00:21:13.830 "data_size": 7936 00:21:13.830 }, 00:21:13.830 { 00:21:13.830 "name": null, 00:21:13.830 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:13.830 "is_configured": false, 00:21:13.830 "data_offset": 256, 00:21:13.830 "data_size": 7936 00:21:13.830 } 00:21:13.830 ] 00:21:13.830 }' 00:21:13.830 18:57:58 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.830 18:57:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:14.398 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:14.398 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:14.398 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:14.398 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:14.398 [2024-07-24 18:57:59.394120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:14.398 [2024-07-24 18:57:59.394151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:14.398 [2024-07-24 18:57:59.394161] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1299a40 00:21:14.398 [2024-07-24 18:57:59.394183] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:14.398 [2024-07-24 18:57:59.394435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:14.398 [2024-07-24 18:57:59.394445] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:14.398 [2024-07-24 18:57:59.394491] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:14.398 [2024-07-24 18:57:59.394503] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:14.398 [2024-07-24 18:57:59.394574] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14454e0 00:21:14.398 [2024-07-24 18:57:59.394580] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:14.398 [2024-07-24 18:57:59.394699] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1448cd0 00:21:14.398 [2024-07-24 18:57:59.394791] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14454e0 00:21:14.398 [2024-07-24 18:57:59.394796] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14454e0 00:21:14.398 [2024-07-24 18:57:59.394867] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:14.398 pt2 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.656 18:57:59 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.656 "name": "raid_bdev1", 00:21:14.656 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:14.656 "strip_size_kb": 0, 00:21:14.656 "state": "online", 00:21:14.656 "raid_level": "raid1", 00:21:14.656 "superblock": true, 00:21:14.656 "num_base_bdevs": 2, 00:21:14.656 "num_base_bdevs_discovered": 2, 00:21:14.656 "num_base_bdevs_operational": 2, 00:21:14.656 "base_bdevs_list": [ 00:21:14.656 { 00:21:14.656 "name": "pt1", 00:21:14.656 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.656 "is_configured": true, 00:21:14.656 "data_offset": 256, 00:21:14.656 "data_size": 7936 00:21:14.656 }, 00:21:14.656 { 00:21:14.656 "name": "pt2", 00:21:14.656 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:14.656 "is_configured": true, 00:21:14.656 "data_offset": 256, 00:21:14.656 "data_size": 7936 00:21:14.656 } 00:21:14.656 ] 00:21:14.656 }' 00:21:14.656 18:57:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.657 18:57:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:15.222 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:15.481 [2024-07-24 18:58:00.232474] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:15.481 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:15.481 "name": "raid_bdev1", 00:21:15.481 "aliases": [ 00:21:15.481 "fd226c41-c06d-4534-b409-811d93f388a9" 00:21:15.481 ], 00:21:15.481 "product_name": "Raid Volume", 00:21:15.481 "block_size": 4096, 00:21:15.481 "num_blocks": 7936, 00:21:15.481 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:15.481 "assigned_rate_limits": { 00:21:15.481 "rw_ios_per_sec": 0, 00:21:15.481 "rw_mbytes_per_sec": 0, 00:21:15.481 "r_mbytes_per_sec": 0, 00:21:15.481 "w_mbytes_per_sec": 0 00:21:15.481 }, 00:21:15.481 "claimed": false, 00:21:15.481 "zoned": false, 00:21:15.481 
"supported_io_types": { 00:21:15.481 "read": true, 00:21:15.481 "write": true, 00:21:15.481 "unmap": false, 00:21:15.481 "flush": false, 00:21:15.481 "reset": true, 00:21:15.481 "nvme_admin": false, 00:21:15.481 "nvme_io": false, 00:21:15.481 "nvme_io_md": false, 00:21:15.481 "write_zeroes": true, 00:21:15.481 "zcopy": false, 00:21:15.481 "get_zone_info": false, 00:21:15.481 "zone_management": false, 00:21:15.481 "zone_append": false, 00:21:15.481 "compare": false, 00:21:15.481 "compare_and_write": false, 00:21:15.481 "abort": false, 00:21:15.481 "seek_hole": false, 00:21:15.481 "seek_data": false, 00:21:15.481 "copy": false, 00:21:15.481 "nvme_iov_md": false 00:21:15.481 }, 00:21:15.481 "memory_domains": [ 00:21:15.481 { 00:21:15.481 "dma_device_id": "system", 00:21:15.481 "dma_device_type": 1 00:21:15.481 }, 00:21:15.481 { 00:21:15.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.481 "dma_device_type": 2 00:21:15.481 }, 00:21:15.481 { 00:21:15.481 "dma_device_id": "system", 00:21:15.481 "dma_device_type": 1 00:21:15.481 }, 00:21:15.481 { 00:21:15.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.481 "dma_device_type": 2 00:21:15.481 } 00:21:15.481 ], 00:21:15.481 "driver_specific": { 00:21:15.481 "raid": { 00:21:15.481 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:15.481 "strip_size_kb": 0, 00:21:15.481 "state": "online", 00:21:15.481 "raid_level": "raid1", 00:21:15.481 "superblock": true, 00:21:15.481 "num_base_bdevs": 2, 00:21:15.481 "num_base_bdevs_discovered": 2, 00:21:15.481 "num_base_bdevs_operational": 2, 00:21:15.481 "base_bdevs_list": [ 00:21:15.481 { 00:21:15.481 "name": "pt1", 00:21:15.481 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:15.481 "is_configured": true, 00:21:15.481 "data_offset": 256, 00:21:15.481 "data_size": 7936 00:21:15.481 }, 00:21:15.481 { 00:21:15.481 "name": "pt2", 00:21:15.481 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:15.481 "is_configured": true, 00:21:15.481 "data_offset": 256, 00:21:15.481 "data_size": 7936 00:21:15.481 } 00:21:15.481 ] 00:21:15.481 } 00:21:15.481 } 00:21:15.481 }' 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:15.482 pt2' 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.482 "name": "pt1", 00:21:15.482 "aliases": [ 00:21:15.482 "00000000-0000-0000-0000-000000000001" 00:21:15.482 ], 00:21:15.482 "product_name": "passthru", 00:21:15.482 "block_size": 4096, 00:21:15.482 "num_blocks": 8192, 00:21:15.482 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:15.482 "assigned_rate_limits": { 00:21:15.482 "rw_ios_per_sec": 0, 00:21:15.482 "rw_mbytes_per_sec": 0, 00:21:15.482 "r_mbytes_per_sec": 0, 00:21:15.482 "w_mbytes_per_sec": 0 00:21:15.482 }, 00:21:15.482 "claimed": true, 00:21:15.482 "claim_type": "exclusive_write", 00:21:15.482 "zoned": false, 00:21:15.482 "supported_io_types": { 00:21:15.482 "read": 
true, 00:21:15.482 "write": true, 00:21:15.482 "unmap": true, 00:21:15.482 "flush": true, 00:21:15.482 "reset": true, 00:21:15.482 "nvme_admin": false, 00:21:15.482 "nvme_io": false, 00:21:15.482 "nvme_io_md": false, 00:21:15.482 "write_zeroes": true, 00:21:15.482 "zcopy": true, 00:21:15.482 "get_zone_info": false, 00:21:15.482 "zone_management": false, 00:21:15.482 "zone_append": false, 00:21:15.482 "compare": false, 00:21:15.482 "compare_and_write": false, 00:21:15.482 "abort": true, 00:21:15.482 "seek_hole": false, 00:21:15.482 "seek_data": false, 00:21:15.482 "copy": true, 00:21:15.482 "nvme_iov_md": false 00:21:15.482 }, 00:21:15.482 "memory_domains": [ 00:21:15.482 { 00:21:15.482 "dma_device_id": "system", 00:21:15.482 "dma_device_type": 1 00:21:15.482 }, 00:21:15.482 { 00:21:15.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.482 "dma_device_type": 2 00:21:15.482 } 00:21:15.482 ], 00:21:15.482 "driver_specific": { 00:21:15.482 "passthru": { 00:21:15.482 "name": "pt1", 00:21:15.482 "base_bdev_name": "malloc1" 00:21:15.482 } 00:21:15.482 } 00:21:15.482 }' 00:21:15.482 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.740 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.999 "name": "pt2", 00:21:15.999 "aliases": [ 00:21:15.999 "00000000-0000-0000-0000-000000000002" 00:21:15.999 ], 00:21:15.999 "product_name": "passthru", 00:21:15.999 "block_size": 4096, 00:21:15.999 "num_blocks": 8192, 00:21:15.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:15.999 "assigned_rate_limits": { 00:21:15.999 "rw_ios_per_sec": 0, 00:21:15.999 "rw_mbytes_per_sec": 0, 00:21:15.999 "r_mbytes_per_sec": 0, 00:21:15.999 "w_mbytes_per_sec": 0 00:21:15.999 }, 00:21:15.999 "claimed": true, 00:21:15.999 "claim_type": "exclusive_write", 00:21:15.999 "zoned": false, 00:21:15.999 "supported_io_types": { 00:21:15.999 "read": true, 00:21:15.999 "write": true, 00:21:15.999 "unmap": true, 00:21:15.999 "flush": 
true, 00:21:15.999 "reset": true, 00:21:15.999 "nvme_admin": false, 00:21:15.999 "nvme_io": false, 00:21:15.999 "nvme_io_md": false, 00:21:15.999 "write_zeroes": true, 00:21:15.999 "zcopy": true, 00:21:15.999 "get_zone_info": false, 00:21:15.999 "zone_management": false, 00:21:15.999 "zone_append": false, 00:21:15.999 "compare": false, 00:21:15.999 "compare_and_write": false, 00:21:15.999 "abort": true, 00:21:15.999 "seek_hole": false, 00:21:15.999 "seek_data": false, 00:21:15.999 "copy": true, 00:21:15.999 "nvme_iov_md": false 00:21:15.999 }, 00:21:15.999 "memory_domains": [ 00:21:15.999 { 00:21:15.999 "dma_device_id": "system", 00:21:15.999 "dma_device_type": 1 00:21:15.999 }, 00:21:15.999 { 00:21:15.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.999 "dma_device_type": 2 00:21:15.999 } 00:21:15.999 ], 00:21:15.999 "driver_specific": { 00:21:15.999 "passthru": { 00:21:15.999 "name": "pt2", 00:21:15.999 "base_bdev_name": "malloc2" 00:21:15.999 } 00:21:15.999 } 00:21:15.999 }' 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.999 18:58:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.258 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:16.516 [2024-07-24 18:58:01.423555] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' fd226c41-c06d-4534-b409-811d93f388a9 '!=' fd226c41-c06d-4534-b409-811d93f388a9 ']' 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:16.516 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:16.776 [2024-07-24 18:58:01.591863] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.776 "name": "raid_bdev1", 00:21:16.776 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:16.776 "strip_size_kb": 0, 00:21:16.776 "state": "online", 00:21:16.776 "raid_level": "raid1", 00:21:16.776 "superblock": true, 00:21:16.776 "num_base_bdevs": 2, 00:21:16.776 "num_base_bdevs_discovered": 1, 00:21:16.776 "num_base_bdevs_operational": 1, 00:21:16.776 "base_bdevs_list": [ 00:21:16.776 { 00:21:16.776 "name": null, 00:21:16.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.776 "is_configured": false, 00:21:16.776 "data_offset": 256, 00:21:16.776 "data_size": 7936 00:21:16.776 }, 00:21:16.776 { 00:21:16.776 "name": "pt2", 00:21:16.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:16.776 "is_configured": true, 00:21:16.776 "data_offset": 256, 00:21:16.776 "data_size": 7936 00:21:16.776 } 00:21:16.776 ] 00:21:16.776 }' 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.776 18:58:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:17.343 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:17.602 [2024-07-24 18:58:02.381869] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:17.602 [2024-07-24 18:58:02.381889] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:17.602 [2024-07-24 18:58:02.381924] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:17.602 [2024-07-24 18:58:02.381952] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:17.602 [2024-07-24 18:58:02.381958] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14454e0 name raid_bdev1, state offline 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:17.602 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:21:17.860 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:18.119 [2024-07-24 18:58:02.875235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:18.119 [2024-07-24 18:58:02.875266] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:18.119 [2024-07-24 18:58:02.875279] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1299050 00:21:18.119 [2024-07-24 18:58:02.875285] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:18.119 [2024-07-24 18:58:02.876466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:18.119 [2024-07-24 18:58:02.876494] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:18.119 [2024-07-24 18:58:02.876540] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:18.119 [2024-07-24 18:58:02.876559] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:18.119 [2024-07-24 18:58:02.876621] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1449f20 00:21:18.119 [2024-07-24 18:58:02.876627] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:18.119 [2024-07-24 18:58:02.876744] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x129a420 00:21:18.119 [2024-07-24 18:58:02.876834] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1449f20 00:21:18.119 [2024-07-24 18:58:02.876839] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1449f20 00:21:18.119 [2024-07-24 18:58:02.876910] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.119 pt2 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.119 18:58:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.119 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.119 "name": "raid_bdev1", 00:21:18.119 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:18.119 "strip_size_kb": 0, 00:21:18.119 "state": "online", 00:21:18.119 "raid_level": "raid1", 00:21:18.119 "superblock": true, 00:21:18.119 "num_base_bdevs": 2, 00:21:18.119 "num_base_bdevs_discovered": 1, 00:21:18.119 "num_base_bdevs_operational": 1, 00:21:18.119 "base_bdevs_list": [ 00:21:18.119 { 00:21:18.119 "name": null, 00:21:18.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.120 "is_configured": false, 00:21:18.120 "data_offset": 256, 00:21:18.120 "data_size": 7936 00:21:18.120 }, 00:21:18.120 { 00:21:18.120 "name": "pt2", 00:21:18.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.120 "is_configured": true, 00:21:18.120 "data_offset": 256, 00:21:18.120 "data_size": 7936 00:21:18.120 } 00:21:18.120 ] 00:21:18.120 }' 00:21:18.120 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.120 18:58:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:18.686 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:18.944 [2024-07-24 18:58:03.701371] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:18.944 [2024-07-24 18:58:03.701391] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:18.944 [2024-07-24 18:58:03.701434] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:18.944 [2024-07-24 18:58:03.701466] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:18.944 [2024-07-24 18:58:03.701478] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1449f20 name raid_bdev1, state offline 00:21:18.944 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.944 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:18.944 18:58:03 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:18.945 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:18.945 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:21:18.945 18:58:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:19.204 [2024-07-24 18:58:04.034225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:19.204 [2024-07-24 18:58:04.034259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:19.204 [2024-07-24 18:58:04.034268] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1443100 00:21:19.204 [2024-07-24 18:58:04.034274] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:19.204 [2024-07-24 18:58:04.035443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:19.204 [2024-07-24 18:58:04.035465] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:19.204 [2024-07-24 18:58:04.035519] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:19.204 [2024-07-24 18:58:04.035536] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:19.204 [2024-07-24 18:58:04.035607] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:19.204 [2024-07-24 18:58:04.035614] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:19.204 [2024-07-24 18:58:04.035621] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1447df0 name raid_bdev1, state configuring 00:21:19.204 [2024-07-24 18:58:04.035636] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:19.204 [2024-07-24 18:58:04.035672] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1448900 00:21:19.204 [2024-07-24 18:58:04.035677] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:19.204 [2024-07-24 18:58:04.035795] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1443be0 00:21:19.204 [2024-07-24 18:58:04.035877] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1448900 00:21:19.204 [2024-07-24 18:58:04.035882] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1448900 00:21:19.204 [2024-07-24 18:58:04.035949] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:19.204 pt1 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.204 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.463 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.463 "name": "raid_bdev1", 00:21:19.463 "uuid": "fd226c41-c06d-4534-b409-811d93f388a9", 00:21:19.463 "strip_size_kb": 0, 00:21:19.463 "state": "online", 00:21:19.463 "raid_level": "raid1", 00:21:19.463 "superblock": true, 00:21:19.463 "num_base_bdevs": 2, 00:21:19.463 "num_base_bdevs_discovered": 1, 00:21:19.463 "num_base_bdevs_operational": 1, 00:21:19.463 "base_bdevs_list": [ 00:21:19.463 { 00:21:19.463 "name": null, 00:21:19.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.463 "is_configured": false, 00:21:19.463 "data_offset": 256, 00:21:19.463 "data_size": 7936 00:21:19.463 }, 00:21:19.463 { 00:21:19.463 "name": "pt2", 00:21:19.463 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:19.463 "is_configured": true, 00:21:19.463 "data_offset": 256, 00:21:19.463 "data_size": 7936 00:21:19.463 } 00:21:19.463 ] 00:21:19.463 }' 00:21:19.463 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.463 18:58:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:19.722 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:19.722 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:19.981 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:19.981 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:19.981 18:58:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:20.240 [2024-07-24 18:58:05.028950] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' fd226c41-c06d-4534-b409-811d93f388a9 '!=' fd226c41-c06d-4534-b409-811d93f388a9 ']' 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2183677 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2183677 ']' 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2183677 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 
-- # '[' Linux = Linux ']' 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2183677 00:21:20.240 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:20.241 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:20.241 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2183677' 00:21:20.241 killing process with pid 2183677 00:21:20.241 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2183677 00:21:20.241 [2024-07-24 18:58:05.090324] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:20.241 [2024-07-24 18:58:05.090364] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:20.241 [2024-07-24 18:58:05.090396] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:20.241 [2024-07-24 18:58:05.090402] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1448900 name raid_bdev1, state offline 00:21:20.241 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2183677 00:21:20.241 [2024-07-24 18:58:05.105987] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:20.500 18:58:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:21:20.500 00:21:20.500 real 0m11.831s 00:21:20.500 user 0m21.710s 00:21:20.500 sys 0m1.930s 00:21:20.500 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:20.500 18:58:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:20.500 ************************************ 00:21:20.500 END TEST raid_superblock_test_4k 00:21:20.500 ************************************ 00:21:20.500 18:58:05 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:21:20.500 18:58:05 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:21:20.500 18:58:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:20.500 18:58:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:20.500 18:58:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:20.500 ************************************ 00:21:20.500 START TEST raid_rebuild_test_sb_4k 00:21:20.500 ************************************ 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo 
BaseBdev1 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2185860 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2185860 /var/tmp/spdk-raid.sock 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2185860 ']' 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:20.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:20.500 18:58:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:20.500 [2024-07-24 18:58:05.395993] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:21:20.500 [2024-07-24 18:58:05.396031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2185860 ] 00:21:20.500 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:20.500 Zero copy mechanism will not be used. 00:21:20.500 [2024-07-24 18:58:05.459215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:20.759 [2024-07-24 18:58:05.537543] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:20.759 [2024-07-24 18:58:05.587890] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:20.759 [2024-07-24 18:58:05.587915] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:21.327 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:21.327 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:21:21.327 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:21.327 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:21:21.586 BaseBdev1_malloc 00:21:21.586 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:21.586 [2024-07-24 18:58:06.507150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:21.586 [2024-07-24 18:58:06.507186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.586 [2024-07-24 18:58:06.507200] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa87130 00:21:21.586 [2024-07-24 18:58:06.507206] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.586 [2024-07-24 18:58:06.508345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.586 [2024-07-24 18:58:06.508366] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:21.586 BaseBdev1 00:21:21.586 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:21.586 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:21:21.844 BaseBdev2_malloc 00:21:21.844 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:21.844 [2024-07-24 18:58:06.851615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:21.844 [2024-07-24 18:58:06.851647] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.844 [2024-07-24 18:58:06.851658] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2cfa0 00:21:21.844 [2024-07-24 18:58:06.851680] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.844 [2024-07-24 
18:58:06.852730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.844 [2024-07-24 18:58:06.852750] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:22.102 BaseBdev2 00:21:22.102 18:58:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:21:22.102 spare_malloc 00:21:22.102 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:22.374 spare_delay 00:21:22.374 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:22.374 [2024-07-24 18:58:07.364516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:22.374 [2024-07-24 18:58:07.364547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.374 [2024-07-24 18:58:07.364559] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2ef40 00:21:22.374 [2024-07-24 18:58:07.364565] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.374 [2024-07-24 18:58:07.365629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.374 [2024-07-24 18:58:07.365649] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:22.374 spare 00:21:22.374 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:22.633 [2024-07-24 18:58:07.532979] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:22.633 [2024-07-24 18:58:07.533837] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:22.633 [2024-07-24 18:58:07.533953] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc30370 00:21:22.633 [2024-07-24 18:58:07.533961] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:22.633 [2024-07-24 18:58:07.534090] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2f1d0 00:21:22.633 [2024-07-24 18:58:07.534183] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc30370 00:21:22.633 [2024-07-24 18:58:07.534189] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc30370 00:21:22.633 [2024-07-24 18:58:07.534254] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.633 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.891 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.891 "name": "raid_bdev1", 00:21:22.891 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:22.891 "strip_size_kb": 0, 00:21:22.891 "state": "online", 00:21:22.891 "raid_level": "raid1", 00:21:22.891 "superblock": true, 00:21:22.891 "num_base_bdevs": 2, 00:21:22.891 "num_base_bdevs_discovered": 2, 00:21:22.891 "num_base_bdevs_operational": 2, 00:21:22.891 "base_bdevs_list": [ 00:21:22.891 { 00:21:22.891 "name": "BaseBdev1", 00:21:22.891 "uuid": "657807d0-faae-5e22-9b3b-89f14c251e60", 00:21:22.891 "is_configured": true, 00:21:22.891 "data_offset": 256, 00:21:22.891 "data_size": 7936 00:21:22.891 }, 00:21:22.891 { 00:21:22.891 "name": "BaseBdev2", 00:21:22.891 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:22.891 "is_configured": true, 00:21:22.891 "data_offset": 256, 00:21:22.891 "data_size": 7936 00:21:22.891 } 00:21:22.891 ] 00:21:22.891 }' 00:21:22.891 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.891 18:58:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:23.459 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:23.459 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:23.459 [2024-07-24 18:58:08.359239] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:23.459 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:21:23.459 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.459 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:23.718 18:58:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.718 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:23.718 [2024-07-24 18:58:08.708028] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2f1d0 00:21:23.718 /dev/nbd0 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:23.976 1+0 records in 00:21:23.976 1+0 records out 00:21:23.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239825 s, 17.1 MB/s 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 
00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:23.976 18:58:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:21:24.557 7936+0 records in 00:21:24.557 7936+0 records out 00:21:24.557 32505856 bytes (33 MB, 31 MiB) copied, 0.507631 s, 64.0 MB/s 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:24.557 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:24.557 [2024-07-24 18:58:09.468623] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:24.826 [2024-07-24 18:58:09.613035] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.826 "name": "raid_bdev1", 00:21:24.826 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:24.826 "strip_size_kb": 0, 00:21:24.826 "state": "online", 00:21:24.826 "raid_level": "raid1", 00:21:24.826 "superblock": true, 00:21:24.826 "num_base_bdevs": 2, 00:21:24.826 "num_base_bdevs_discovered": 1, 00:21:24.826 "num_base_bdevs_operational": 1, 00:21:24.826 "base_bdevs_list": [ 00:21:24.826 { 00:21:24.826 "name": null, 00:21:24.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.826 "is_configured": false, 00:21:24.826 "data_offset": 256, 00:21:24.826 "data_size": 7936 00:21:24.826 }, 00:21:24.826 { 00:21:24.826 "name": "BaseBdev2", 00:21:24.826 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:24.826 "is_configured": true, 00:21:24.826 "data_offset": 256, 00:21:24.826 "data_size": 7936 00:21:24.826 } 00:21:24.826 ] 00:21:24.826 }' 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.826 18:58:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:25.393 18:58:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:25.652 [2024-07-24 18:58:10.439191] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:25.652 [2024-07-24 18:58:10.443502] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2f1d0 00:21:25.652 [2024-07-24 18:58:10.444852] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:25.652 18:58:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.585 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:26.843 "name": "raid_bdev1", 00:21:26.843 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:26.843 "strip_size_kb": 0, 00:21:26.843 "state": "online", 00:21:26.843 "raid_level": "raid1", 00:21:26.843 "superblock": 
true, 00:21:26.843 "num_base_bdevs": 2, 00:21:26.843 "num_base_bdevs_discovered": 2, 00:21:26.843 "num_base_bdevs_operational": 2, 00:21:26.843 "process": { 00:21:26.843 "type": "rebuild", 00:21:26.843 "target": "spare", 00:21:26.843 "progress": { 00:21:26.843 "blocks": 2816, 00:21:26.843 "percent": 35 00:21:26.843 } 00:21:26.843 }, 00:21:26.843 "base_bdevs_list": [ 00:21:26.843 { 00:21:26.843 "name": "spare", 00:21:26.843 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:26.843 "is_configured": true, 00:21:26.843 "data_offset": 256, 00:21:26.843 "data_size": 7936 00:21:26.843 }, 00:21:26.843 { 00:21:26.843 "name": "BaseBdev2", 00:21:26.843 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:26.843 "is_configured": true, 00:21:26.843 "data_offset": 256, 00:21:26.843 "data_size": 7936 00:21:26.843 } 00:21:26.843 ] 00:21:26.843 }' 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:26.843 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:27.101 [2024-07-24 18:58:11.875886] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:27.101 [2024-07-24 18:58:11.955305] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:27.101 [2024-07-24 18:58:11.955333] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:27.101 [2024-07-24 18:58:11.955342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:27.101 [2024-07-24 18:58:11.955362] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.101 18:58:11 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.358 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.358 "name": "raid_bdev1", 00:21:27.358 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:27.358 "strip_size_kb": 0, 00:21:27.358 "state": "online", 00:21:27.358 "raid_level": "raid1", 00:21:27.358 "superblock": true, 00:21:27.358 "num_base_bdevs": 2, 00:21:27.358 "num_base_bdevs_discovered": 1, 00:21:27.358 "num_base_bdevs_operational": 1, 00:21:27.358 "base_bdevs_list": [ 00:21:27.358 { 00:21:27.358 "name": null, 00:21:27.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.358 "is_configured": false, 00:21:27.358 "data_offset": 256, 00:21:27.358 "data_size": 7936 00:21:27.358 }, 00:21:27.358 { 00:21:27.358 "name": "BaseBdev2", 00:21:27.358 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:27.358 "is_configured": true, 00:21:27.358 "data_offset": 256, 00:21:27.358 "data_size": 7936 00:21:27.358 } 00:21:27.358 ] 00:21:27.358 }' 00:21:27.358 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.358 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.616 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.874 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:27.874 "name": "raid_bdev1", 00:21:27.874 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:27.874 "strip_size_kb": 0, 00:21:27.874 "state": "online", 00:21:27.874 "raid_level": "raid1", 00:21:27.874 "superblock": true, 00:21:27.874 "num_base_bdevs": 2, 00:21:27.874 "num_base_bdevs_discovered": 1, 00:21:27.874 "num_base_bdevs_operational": 1, 00:21:27.874 "base_bdevs_list": [ 00:21:27.874 { 00:21:27.874 "name": null, 00:21:27.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.874 "is_configured": false, 00:21:27.874 "data_offset": 256, 00:21:27.875 "data_size": 7936 00:21:27.875 }, 00:21:27.875 { 00:21:27.875 "name": "BaseBdev2", 00:21:27.875 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:27.875 "is_configured": true, 00:21:27.875 "data_offset": 256, 00:21:27.875 "data_size": 7936 00:21:27.875 } 00:21:27.875 ] 00:21:27.875 }' 00:21:27.875 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:27.875 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:27.875 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:27.875 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == 
\n\o\n\e ]] 00:21:27.875 18:58:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:28.133 [2024-07-24 18:58:13.010113] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:28.133 [2024-07-24 18:58:13.014410] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc38cf0 00:21:28.133 [2024-07-24 18:58:13.015450] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:28.133 18:58:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:29.066 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:29.066 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:29.066 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:29.067 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:29.067 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:29.067 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.067 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:29.325 "name": "raid_bdev1", 00:21:29.325 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:29.325 "strip_size_kb": 0, 00:21:29.325 "state": "online", 00:21:29.325 "raid_level": "raid1", 00:21:29.325 "superblock": true, 00:21:29.325 "num_base_bdevs": 2, 00:21:29.325 "num_base_bdevs_discovered": 2, 00:21:29.325 "num_base_bdevs_operational": 2, 00:21:29.325 "process": { 00:21:29.325 "type": "rebuild", 00:21:29.325 "target": "spare", 00:21:29.325 "progress": { 00:21:29.325 "blocks": 2816, 00:21:29.325 "percent": 35 00:21:29.325 } 00:21:29.325 }, 00:21:29.325 "base_bdevs_list": [ 00:21:29.325 { 00:21:29.325 "name": "spare", 00:21:29.325 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:29.325 "is_configured": true, 00:21:29.325 "data_offset": 256, 00:21:29.325 "data_size": 7936 00:21:29.325 }, 00:21:29.325 { 00:21:29.325 "name": "BaseBdev2", 00:21:29.325 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:29.325 "is_configured": true, 00:21:29.325 "data_offset": 256, 00:21:29.325 "data_size": 7936 00:21:29.325 } 00:21:29.325 ] 00:21:29.325 }' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:29.325 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 
665: [: =: unary operator expected 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=775 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.325 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:29.584 "name": "raid_bdev1", 00:21:29.584 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:29.584 "strip_size_kb": 0, 00:21:29.584 "state": "online", 00:21:29.584 "raid_level": "raid1", 00:21:29.584 "superblock": true, 00:21:29.584 "num_base_bdevs": 2, 00:21:29.584 "num_base_bdevs_discovered": 2, 00:21:29.584 "num_base_bdevs_operational": 2, 00:21:29.584 "process": { 00:21:29.584 "type": "rebuild", 00:21:29.584 "target": "spare", 00:21:29.584 "progress": { 00:21:29.584 "blocks": 3584, 00:21:29.584 "percent": 45 00:21:29.584 } 00:21:29.584 }, 00:21:29.584 "base_bdevs_list": [ 00:21:29.584 { 00:21:29.584 "name": "spare", 00:21:29.584 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:29.584 "is_configured": true, 00:21:29.584 "data_offset": 256, 00:21:29.584 "data_size": 7936 00:21:29.584 }, 00:21:29.584 { 00:21:29.584 "name": "BaseBdev2", 00:21:29.584 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:29.584 "is_configured": true, 00:21:29.584 "data_offset": 256, 00:21:29.584 "data_size": 7936 00:21:29.584 } 00:21:29.584 ] 00:21:29.584 }' 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:29.584 18:58:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 
-- # local raid_bdev_name=raid_bdev1 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.519 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:30.778 "name": "raid_bdev1", 00:21:30.778 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:30.778 "strip_size_kb": 0, 00:21:30.778 "state": "online", 00:21:30.778 "raid_level": "raid1", 00:21:30.778 "superblock": true, 00:21:30.778 "num_base_bdevs": 2, 00:21:30.778 "num_base_bdevs_discovered": 2, 00:21:30.778 "num_base_bdevs_operational": 2, 00:21:30.778 "process": { 00:21:30.778 "type": "rebuild", 00:21:30.778 "target": "spare", 00:21:30.778 "progress": { 00:21:30.778 "blocks": 6656, 00:21:30.778 "percent": 83 00:21:30.778 } 00:21:30.778 }, 00:21:30.778 "base_bdevs_list": [ 00:21:30.778 { 00:21:30.778 "name": "spare", 00:21:30.778 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:30.778 "is_configured": true, 00:21:30.778 "data_offset": 256, 00:21:30.778 "data_size": 7936 00:21:30.778 }, 00:21:30.778 { 00:21:30.778 "name": "BaseBdev2", 00:21:30.778 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:30.778 "is_configured": true, 00:21:30.778 "data_offset": 256, 00:21:30.778 "data_size": 7936 00:21:30.778 } 00:21:30.778 ] 00:21:30.778 }' 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:30.778 18:58:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:31.345 [2024-07-24 18:58:16.137157] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:31.345 [2024-07-24 18:58:16.137194] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:31.345 [2024-07-24 18:58:16.137250] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:31.913 18:58:16 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.913 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.172 "name": "raid_bdev1", 00:21:32.172 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:32.172 "strip_size_kb": 0, 00:21:32.172 "state": "online", 00:21:32.172 "raid_level": "raid1", 00:21:32.172 "superblock": true, 00:21:32.172 "num_base_bdevs": 2, 00:21:32.172 "num_base_bdevs_discovered": 2, 00:21:32.172 "num_base_bdevs_operational": 2, 00:21:32.172 "base_bdevs_list": [ 00:21:32.172 { 00:21:32.172 "name": "spare", 00:21:32.172 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:32.172 "is_configured": true, 00:21:32.172 "data_offset": 256, 00:21:32.172 "data_size": 7936 00:21:32.172 }, 00:21:32.172 { 00:21:32.172 "name": "BaseBdev2", 00:21:32.172 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:32.172 "is_configured": true, 00:21:32.172 "data_offset": 256, 00:21:32.172 "data_size": 7936 00:21:32.172 } 00:21:32.172 ] 00:21:32.172 }' 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:32.172 18:58:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.172 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.172 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.172 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.172 "name": "raid_bdev1", 00:21:32.172 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:32.172 "strip_size_kb": 0, 00:21:32.172 "state": "online", 00:21:32.172 "raid_level": "raid1", 00:21:32.172 "superblock": true, 00:21:32.172 "num_base_bdevs": 2, 00:21:32.172 "num_base_bdevs_discovered": 2, 00:21:32.173 "num_base_bdevs_operational": 2, 00:21:32.173 "base_bdevs_list": [ 00:21:32.173 { 00:21:32.173 "name": "spare", 00:21:32.173 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:32.173 "is_configured": true, 00:21:32.173 "data_offset": 256, 00:21:32.173 "data_size": 7936 00:21:32.173 }, 00:21:32.173 { 00:21:32.173 "name": "BaseBdev2", 00:21:32.173 "uuid": 
"04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:32.173 "is_configured": true, 00:21:32.173 "data_offset": 256, 00:21:32.173 "data_size": 7936 00:21:32.173 } 00:21:32.173 ] 00:21:32.173 }' 00:21:32.173 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.432 "name": "raid_bdev1", 00:21:32.432 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:32.432 "strip_size_kb": 0, 00:21:32.432 "state": "online", 00:21:32.432 "raid_level": "raid1", 00:21:32.432 "superblock": true, 00:21:32.432 "num_base_bdevs": 2, 00:21:32.432 "num_base_bdevs_discovered": 2, 00:21:32.432 "num_base_bdevs_operational": 2, 00:21:32.432 "base_bdevs_list": [ 00:21:32.432 { 00:21:32.432 "name": "spare", 00:21:32.432 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:32.432 "is_configured": true, 00:21:32.432 "data_offset": 256, 00:21:32.432 "data_size": 7936 00:21:32.432 }, 00:21:32.432 { 00:21:32.432 "name": "BaseBdev2", 00:21:32.432 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:32.432 "is_configured": true, 00:21:32.432 "data_offset": 256, 00:21:32.432 "data_size": 7936 00:21:32.432 } 00:21:32.432 ] 00:21:32.432 }' 00:21:32.432 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.433 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:33.000 18:58:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:33.259 [2024-07-24 18:58:18.065969] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:33.259 [2024-07-24 18:58:18.065990] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:33.259 [2024-07-24 18:58:18.066038] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:33.259 [2024-07-24 18:58:18.066078] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:33.259 [2024-07-24 18:58:18.066084] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc30370 name raid_bdev1, state offline 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:33.259 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:33.517 /dev/nbd0 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- 
# (( i <= 20 )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:33.517 1+0 records in 00:21:33.517 1+0 records out 00:21:33.517 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024148 s, 17.0 MB/s 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:33.517 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:33.775 /dev/nbd1 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:33.775 1+0 records in 00:21:33.775 1+0 records out 00:21:33.775 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221239 s, 18.5 MB/s 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:33.775 18:58:18 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:33.775 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:34.034 18:58:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:34.293 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # 
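Editor's note: with both devices exported, the test compares their contents while skipping the first 1 MiB (data_offset 256 blocks of 4096 bytes, i.e. the superblock region that legitimately differs per member), then detaches the NBD devices. A sketch, assuming the devices from the step above:

# Compare the mirrored data regions, ignoring the first 1 MiB of metadata, then stop both NBD disks.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
cmp -i 1048576 /dev/nbd0 /dev/nbd1
for nbd in /dev/nbd0 /dev/nbd1; do
    $rpc_py nbd_stop_disk "$nbd"
    name=$(basename "$nbd")
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions || break
        sleep 0.1
    done
done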
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:34.552 [2024-07-24 18:58:19.430264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:34.552 [2024-07-24 18:58:19.430299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.552 [2024-07-24 18:58:19.430310] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa81600 00:21:34.552 [2024-07-24 18:58:19.430316] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.552 [2024-07-24 18:58:19.431538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.552 [2024-07-24 18:58:19.431564] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:34.552 [2024-07-24 18:58:19.431618] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:34.552 [2024-07-24 18:58:19.431637] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:34.552 [2024-07-24 18:58:19.431712] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:34.552 spare 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.552 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.552 [2024-07-24 18:58:19.532003] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa84f00 00:21:34.552 [2024-07-24 18:58:19.532013] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:34.552 [2024-07-24 18:58:19.532132] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc38cf0 00:21:34.552 [2024-07-24 18:58:19.532229] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa84f00 00:21:34.552 [2024-07-24 18:58:19.532234] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa84f00 00:21:34.552 [2024-07-24 18:58:19.532299] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:34.811 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
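Editor's note: recreating the 'spare' passthru on top of spare_delay causes SPDK to re-examine the new bdev; the RAID superblock written earlier is found and the bdev is claimed back into raid_bdev1 without an explicit add, as the notices above show. A sketch of the two RPCs involved:

# Recreate the 'spare' passthru and let bdev examine re-claim it into raid_bdev1 automatically.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_passthru_delete spare
$rpc_py bdev_passthru_create -b spare_delay -p spare
$rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'   # expect "online"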
raid_bdev_info='{ 00:21:34.811 "name": "raid_bdev1", 00:21:34.811 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:34.811 "strip_size_kb": 0, 00:21:34.811 "state": "online", 00:21:34.811 "raid_level": "raid1", 00:21:34.811 "superblock": true, 00:21:34.811 "num_base_bdevs": 2, 00:21:34.811 "num_base_bdevs_discovered": 2, 00:21:34.811 "num_base_bdevs_operational": 2, 00:21:34.811 "base_bdevs_list": [ 00:21:34.811 { 00:21:34.811 "name": "spare", 00:21:34.811 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:34.811 "is_configured": true, 00:21:34.811 "data_offset": 256, 00:21:34.811 "data_size": 7936 00:21:34.811 }, 00:21:34.811 { 00:21:34.811 "name": "BaseBdev2", 00:21:34.811 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:34.811 "is_configured": true, 00:21:34.811 "data_offset": 256, 00:21:34.811 "data_size": 7936 00:21:34.811 } 00:21:34.811 ] 00:21:34.811 }' 00:21:34.811 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.811 18:58:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.378 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:35.378 "name": "raid_bdev1", 00:21:35.378 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:35.378 "strip_size_kb": 0, 00:21:35.378 "state": "online", 00:21:35.378 "raid_level": "raid1", 00:21:35.378 "superblock": true, 00:21:35.379 "num_base_bdevs": 2, 00:21:35.379 "num_base_bdevs_discovered": 2, 00:21:35.379 "num_base_bdevs_operational": 2, 00:21:35.379 "base_bdevs_list": [ 00:21:35.379 { 00:21:35.379 "name": "spare", 00:21:35.379 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:35.379 "is_configured": true, 00:21:35.379 "data_offset": 256, 00:21:35.379 "data_size": 7936 00:21:35.379 }, 00:21:35.379 { 00:21:35.379 "name": "BaseBdev2", 00:21:35.379 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:35.379 "is_configured": true, 00:21:35.379 "data_offset": 256, 00:21:35.379 "data_size": 7936 00:21:35.379 } 00:21:35.379 ] 00:21:35.379 }' 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.379 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:35.637 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:35.637 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:35.895 [2024-07-24 18:58:20.673529] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.896 "name": "raid_bdev1", 00:21:35.896 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:35.896 "strip_size_kb": 0, 00:21:35.896 "state": "online", 00:21:35.896 "raid_level": "raid1", 00:21:35.896 "superblock": true, 00:21:35.896 "num_base_bdevs": 2, 00:21:35.896 "num_base_bdevs_discovered": 1, 00:21:35.896 "num_base_bdevs_operational": 1, 00:21:35.896 "base_bdevs_list": [ 00:21:35.896 { 00:21:35.896 "name": null, 00:21:35.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.896 "is_configured": false, 00:21:35.896 "data_offset": 256, 00:21:35.896 "data_size": 7936 00:21:35.896 }, 00:21:35.896 { 00:21:35.896 "name": "BaseBdev2", 00:21:35.896 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:35.896 "is_configured": true, 00:21:35.896 "data_offset": 256, 00:21:35.896 "data_size": 7936 00:21:35.896 } 00:21:35.896 ] 00:21:35.896 }' 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.896 18:58:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:36.478 18:58:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:36.478 [2024-07-24 18:58:21.479616] 
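Editor's note: removing a base bdev from a superblock raid1 leaves the array online but degraded; the vacated slot reports name null and is_configured false, as the JSON above shows. A sketch of that removal and check:

# Remove the 'spare' member and confirm raid_bdev1 stays online with a single operational base bdev.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_remove_base_bdev spare
info=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.state' <<< "$info") == online ]]
[[ $(jq -r '.num_base_bdevs_operational' <<< "$info") == 1 ]]
jq -r '.base_bdevs_list[0].name' <<< "$info"   # prints "null" for the vacated slot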
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:36.478 [2024-07-24 18:58:21.479724] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:36.478 [2024-07-24 18:58:21.479734] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:36.478 [2024-07-24 18:58:21.479754] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:36.478 [2024-07-24 18:58:21.484028] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc301b0 00:21:36.478 [2024-07-24 18:58:21.485035] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:36.737 18:58:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:37.673 "name": "raid_bdev1", 00:21:37.673 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:37.673 "strip_size_kb": 0, 00:21:37.673 "state": "online", 00:21:37.673 "raid_level": "raid1", 00:21:37.673 "superblock": true, 00:21:37.673 "num_base_bdevs": 2, 00:21:37.673 "num_base_bdevs_discovered": 2, 00:21:37.673 "num_base_bdevs_operational": 2, 00:21:37.673 "process": { 00:21:37.673 "type": "rebuild", 00:21:37.673 "target": "spare", 00:21:37.673 "progress": { 00:21:37.673 "blocks": 2816, 00:21:37.673 "percent": 35 00:21:37.673 } 00:21:37.673 }, 00:21:37.673 "base_bdevs_list": [ 00:21:37.673 { 00:21:37.673 "name": "spare", 00:21:37.673 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:37.673 "is_configured": true, 00:21:37.673 "data_offset": 256, 00:21:37.673 "data_size": 7936 00:21:37.673 }, 00:21:37.673 { 00:21:37.673 "name": "BaseBdev2", 00:21:37.673 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:37.673 "is_configured": true, 00:21:37.673 "data_offset": 256, 00:21:37.673 "data_size": 7936 00:21:37.673 } 00:21:37.673 ] 00:21:37.673 }' 00:21:37.673 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:37.939 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:37.939 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:37.939 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:37.939 18:58:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # 
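Editor's note: the spare is handed back explicitly with bdev_raid_add_base_bdev; because its on-disk superblock sequence number (4) is older than the live array's (5), it is re-added as a stale member and a rebuild onto it starts, which the progress check above observes. A sketch of that step:

# Re-add the stale 'spare' and confirm a rebuild process appears on raid_bdev1.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_add_base_bdev raid_bdev1 spare
sleep 1
$rpc_py bdev_raid_get_bdevs all |
    jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'   # expect "rebuild"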
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:37.939 [2024-07-24 18:58:22.904042] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:38.198 [2024-07-24 18:58:22.995519] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:38.198 [2024-07-24 18:58:22.995549] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:38.198 [2024-07-24 18:58:22.995557] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:38.198 [2024-07-24 18:58:22.995578] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.198 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.198 "name": "raid_bdev1", 00:21:38.198 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:38.198 "strip_size_kb": 0, 00:21:38.198 "state": "online", 00:21:38.198 "raid_level": "raid1", 00:21:38.198 "superblock": true, 00:21:38.198 "num_base_bdevs": 2, 00:21:38.198 "num_base_bdevs_discovered": 1, 00:21:38.198 "num_base_bdevs_operational": 1, 00:21:38.198 "base_bdevs_list": [ 00:21:38.198 { 00:21:38.198 "name": null, 00:21:38.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.199 "is_configured": false, 00:21:38.199 "data_offset": 256, 00:21:38.199 "data_size": 7936 00:21:38.199 }, 00:21:38.199 { 00:21:38.199 "name": "BaseBdev2", 00:21:38.199 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:38.199 "is_configured": true, 00:21:38.199 "data_offset": 256, 00:21:38.199 "data_size": 7936 00:21:38.199 } 00:21:38.199 ] 00:21:38.199 }' 00:21:38.199 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.199 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:38.767 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:21:39.025 [2024-07-24 18:58:23.817632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:39.025 [2024-07-24 18:58:23.817667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.025 [2024-07-24 18:58:23.817680] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa82150 00:21:39.025 [2024-07-24 18:58:23.817685] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.025 [2024-07-24 18:58:23.817943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.025 [2024-07-24 18:58:23.817953] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:39.025 [2024-07-24 18:58:23.818006] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:39.025 [2024-07-24 18:58:23.818012] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:39.025 [2024-07-24 18:58:23.818017] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:39.025 [2024-07-24 18:58:23.818027] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:39.025 [2024-07-24 18:58:23.822157] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2aef0 00:21:39.025 [2024-07-24 18:58:23.823124] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:39.025 spare 00:21:39.025 18:58:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.961 18:58:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.219 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:40.219 "name": "raid_bdev1", 00:21:40.219 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:40.219 "strip_size_kb": 0, 00:21:40.219 "state": "online", 00:21:40.219 "raid_level": "raid1", 00:21:40.219 "superblock": true, 00:21:40.219 "num_base_bdevs": 2, 00:21:40.219 "num_base_bdevs_discovered": 2, 00:21:40.219 "num_base_bdevs_operational": 2, 00:21:40.219 "process": { 00:21:40.219 "type": "rebuild", 00:21:40.219 "target": "spare", 00:21:40.219 "progress": { 00:21:40.219 "blocks": 2816, 00:21:40.219 "percent": 35 00:21:40.219 } 00:21:40.219 }, 00:21:40.219 "base_bdevs_list": [ 00:21:40.219 { 00:21:40.219 "name": "spare", 00:21:40.219 "uuid": "0c3a4430-419d-5048-b40c-53c67f2473be", 00:21:40.219 "is_configured": true, 00:21:40.219 "data_offset": 256, 00:21:40.219 "data_size": 7936 00:21:40.219 }, 00:21:40.219 { 
00:21:40.219 "name": "BaseBdev2", 00:21:40.219 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:40.219 "is_configured": true, 00:21:40.219 "data_offset": 256, 00:21:40.219 "data_size": 7936 00:21:40.219 } 00:21:40.219 ] 00:21:40.219 }' 00:21:40.219 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:40.219 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:40.219 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:40.219 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:40.220 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:40.478 [2024-07-24 18:58:25.254562] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:40.478 [2024-07-24 18:58:25.333531] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:40.478 [2024-07-24 18:58:25.333560] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:40.478 [2024-07-24 18:58:25.333568] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:40.478 [2024-07-24 18:58:25.333572] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.478 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.736 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.736 "name": "raid_bdev1", 00:21:40.736 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:40.736 "strip_size_kb": 0, 00:21:40.736 "state": "online", 00:21:40.736 "raid_level": "raid1", 00:21:40.736 "superblock": true, 00:21:40.736 "num_base_bdevs": 2, 00:21:40.736 "num_base_bdevs_discovered": 1, 00:21:40.736 "num_base_bdevs_operational": 1, 00:21:40.736 "base_bdevs_list": [ 00:21:40.736 { 00:21:40.736 "name": null, 00:21:40.736 
"uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.736 "is_configured": false, 00:21:40.736 "data_offset": 256, 00:21:40.736 "data_size": 7936 00:21:40.736 }, 00:21:40.736 { 00:21:40.736 "name": "BaseBdev2", 00:21:40.736 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:40.736 "is_configured": true, 00:21:40.736 "data_offset": 256, 00:21:40.736 "data_size": 7936 00:21:40.736 } 00:21:40.736 ] 00:21:40.736 }' 00:21:40.736 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.737 18:58:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:41.373 "name": "raid_bdev1", 00:21:41.373 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:41.373 "strip_size_kb": 0, 00:21:41.373 "state": "online", 00:21:41.373 "raid_level": "raid1", 00:21:41.373 "superblock": true, 00:21:41.373 "num_base_bdevs": 2, 00:21:41.373 "num_base_bdevs_discovered": 1, 00:21:41.373 "num_base_bdevs_operational": 1, 00:21:41.373 "base_bdevs_list": [ 00:21:41.373 { 00:21:41.373 "name": null, 00:21:41.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.373 "is_configured": false, 00:21:41.373 "data_offset": 256, 00:21:41.373 "data_size": 7936 00:21:41.373 }, 00:21:41.373 { 00:21:41.373 "name": "BaseBdev2", 00:21:41.373 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:41.373 "is_configured": true, 00:21:41.373 "data_offset": 256, 00:21:41.373 "data_size": 7936 00:21:41.373 } 00:21:41.373 ] 00:21:41.373 }' 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:41.373 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:41.695 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:41.695 [2024-07-24 18:58:26.592787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:41.695 [2024-07-24 18:58:26.592822] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.695 [2024-07-24 18:58:26.592834] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa87360 00:21:41.695 [2024-07-24 18:58:26.592841] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.695 [2024-07-24 18:58:26.593088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.695 [2024-07-24 18:58:26.593098] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:41.695 [2024-07-24 18:58:26.593143] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:41.695 [2024-07-24 18:58:26.593150] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:41.695 [2024-07-24 18:58:26.593154] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:41.695 BaseBdev1 00:21:41.695 18:58:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.628 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.886 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.886 "name": "raid_bdev1", 00:21:42.886 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:42.886 "strip_size_kb": 0, 00:21:42.886 "state": "online", 00:21:42.886 "raid_level": "raid1", 00:21:42.886 "superblock": true, 00:21:42.886 "num_base_bdevs": 2, 00:21:42.886 "num_base_bdevs_discovered": 1, 00:21:42.886 "num_base_bdevs_operational": 1, 00:21:42.886 "base_bdevs_list": [ 00:21:42.886 { 00:21:42.886 "name": null, 00:21:42.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.886 "is_configured": false, 00:21:42.886 "data_offset": 256, 00:21:42.886 "data_size": 7936 00:21:42.886 }, 00:21:42.886 { 00:21:42.886 "name": "BaseBdev2", 00:21:42.886 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:42.886 "is_configured": true, 00:21:42.886 "data_offset": 256, 00:21:42.886 "data_size": 7936 00:21:42.886 } 00:21:42.886 
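Editor's note: unlike the spare, a recreated BaseBdev1 is not claimed back automatically; its superblock sequence number (1) is stale and the live array's superblock no longer contains its uuid, so it stays outside raid_bdev1, as the debug lines above state. A sketch of the recreate plus the degraded-state check:

# Recreate BaseBdev1 on its malloc backing bdev and confirm it did not rejoin raid_bdev1.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_passthru_delete BaseBdev1
$rpc_py bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
sleep 1
$rpc_py bdev_raid_get_bdevs all |
    jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'   # still 1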
] 00:21:42.886 }' 00:21:42.886 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.886 18:58:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.449 "name": "raid_bdev1", 00:21:43.449 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:43.449 "strip_size_kb": 0, 00:21:43.449 "state": "online", 00:21:43.449 "raid_level": "raid1", 00:21:43.449 "superblock": true, 00:21:43.449 "num_base_bdevs": 2, 00:21:43.449 "num_base_bdevs_discovered": 1, 00:21:43.449 "num_base_bdevs_operational": 1, 00:21:43.449 "base_bdevs_list": [ 00:21:43.449 { 00:21:43.449 "name": null, 00:21:43.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.449 "is_configured": false, 00:21:43.449 "data_offset": 256, 00:21:43.449 "data_size": 7936 00:21:43.449 }, 00:21:43.449 { 00:21:43.449 "name": "BaseBdev2", 00:21:43.449 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:43.449 "is_configured": true, 00:21:43.449 "data_offset": 256, 00:21:43.449 "data_size": 7936 00:21:43.449 } 00:21:43.449 ] 00:21:43.449 }' 00:21:43.449 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:43.706 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:43.706 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:43.706 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:43.707 [2024-07-24 18:58:28.682235] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:43.707 [2024-07-24 18:58:28.682333] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:43.707 [2024-07-24 18:58:28.682341] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:43.707 request: 00:21:43.707 { 00:21:43.707 "base_bdev": "BaseBdev1", 00:21:43.707 "raid_bdev": "raid_bdev1", 00:21:43.707 "method": "bdev_raid_add_base_bdev", 00:21:43.707 "req_id": 1 00:21:43.707 } 00:21:43.707 Got JSON-RPC error response 00:21:43.707 response: 00:21:43.707 { 00:21:43.707 "code": -22, 00:21:43.707 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:43.707 } 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:43.707 18:58:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.081 
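Editor's note: explicitly adding BaseBdev1, whose superblock does not belong to the array, is expected to fail with JSON-RPC error -22 ("Failed to add base bdev to RAID bdev: Invalid argument"); the harness wraps the RPC in its NOT helper so a non-zero exit is the passing outcome. A condensed sketch with plain shell negation standing in for NOT:

# The add must be rejected; treat a successful RPC as a test failure.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
if $rpc_py bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
    echo "unexpected: stale BaseBdev1 was accepted" >&2
    exit 1
fi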
18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.081 "name": "raid_bdev1", 00:21:45.081 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:45.081 "strip_size_kb": 0, 00:21:45.081 "state": "online", 00:21:45.081 "raid_level": "raid1", 00:21:45.081 "superblock": true, 00:21:45.081 "num_base_bdevs": 2, 00:21:45.081 "num_base_bdevs_discovered": 1, 00:21:45.081 "num_base_bdevs_operational": 1, 00:21:45.081 "base_bdevs_list": [ 00:21:45.081 { 00:21:45.081 "name": null, 00:21:45.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.081 "is_configured": false, 00:21:45.081 "data_offset": 256, 00:21:45.081 "data_size": 7936 00:21:45.081 }, 00:21:45.081 { 00:21:45.081 "name": "BaseBdev2", 00:21:45.081 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:45.081 "is_configured": true, 00:21:45.081 "data_offset": 256, 00:21:45.081 "data_size": 7936 00:21:45.081 } 00:21:45.081 ] 00:21:45.081 }' 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.081 18:58:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:45.647 "name": "raid_bdev1", 00:21:45.647 "uuid": "ded3a2b6-6eb8-425e-9256-b56b1b5dcd86", 00:21:45.647 "strip_size_kb": 0, 00:21:45.647 "state": "online", 00:21:45.647 "raid_level": "raid1", 00:21:45.647 "superblock": true, 00:21:45.647 "num_base_bdevs": 2, 00:21:45.647 "num_base_bdevs_discovered": 1, 00:21:45.647 "num_base_bdevs_operational": 1, 00:21:45.647 "base_bdevs_list": [ 00:21:45.647 { 00:21:45.647 "name": null, 00:21:45.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.647 "is_configured": false, 00:21:45.647 "data_offset": 256, 00:21:45.647 "data_size": 7936 00:21:45.647 }, 00:21:45.647 { 00:21:45.647 "name": "BaseBdev2", 00:21:45.647 "uuid": "04d9b8fe-391a-5ed4-bca0-0b98f2ce6a72", 00:21:45.647 "is_configured": true, 00:21:45.647 "data_offset": 256, 00:21:45.647 "data_size": 7936 00:21:45.647 } 00:21:45.647 ] 00:21:45.647 }' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2185860 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2185860 ']' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2185860 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:45.647 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2185860 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2185860' 00:21:45.905 killing process with pid 2185860 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2185860 00:21:45.905 Received shutdown signal, test time was about 60.000000 seconds 00:21:45.905 00:21:45.905 Latency(us) 00:21:45.905 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:45.905 =================================================================================================================== 00:21:45.905 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:45.905 [2024-07-24 18:58:30.677058] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:45.905 [2024-07-24 18:58:30.677124] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:45.905 [2024-07-24 18:58:30.677156] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:45.905 [2024-07-24 18:58:30.677161] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa84f00 name raid_bdev1, state offline 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2185860 00:21:45.905 [2024-07-24 18:58:30.700413] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:45.905 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:21:45.905 00:21:45.906 real 0m25.525s 00:21:45.906 user 0m39.113s 00:21:45.906 sys 0m3.254s 00:21:45.906 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:45.906 18:58:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:45.906 ************************************ 00:21:45.906 END TEST raid_rebuild_test_sb_4k 00:21:45.906 ************************************ 00:21:45.906 18:58:30 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:21:45.906 18:58:30 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:21:45.906 18:58:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:45.906 18:58:30 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:21:45.906 18:58:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:46.164 ************************************ 00:21:46.164 START TEST raid_state_function_test_sb_md_separate 00:21:46.164 ************************************ 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2190512 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2190512' 00:21:46.164 Process raid pid: 2190512 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2190512 /var/tmp/spdk-raid.sock 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2190512 ']' 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:46.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:46.164 18:58:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:46.164 [2024-07-24 18:58:30.994575] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:21:46.164 [2024-07-24 18:58:30.994634] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:46.164 [2024-07-24 18:58:31.058743] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:46.164 [2024-07-24 18:58:31.137515] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:46.422 [2024-07-24 18:58:31.196978] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:46.422 [2024-07-24 18:58:31.197002] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:46.989 [2024-07-24 18:58:31.928152] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:46.989 [2024-07-24 18:58:31.928180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:46.989 [2024-07-24 18:58:31.928185] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:46.989 [2024-07-24 18:58:31.928191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.989 18:58:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.989 18:58:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.246 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.246 "name": "Existed_Raid", 00:21:47.246 "uuid": "1f31eb4b-d2cc-4209-ae59-90b3e1e09718", 00:21:47.246 "strip_size_kb": 0, 00:21:47.246 "state": "configuring", 00:21:47.246 "raid_level": "raid1", 00:21:47.246 "superblock": true, 00:21:47.246 "num_base_bdevs": 2, 00:21:47.246 "num_base_bdevs_discovered": 0, 00:21:47.246 "num_base_bdevs_operational": 2, 00:21:47.246 "base_bdevs_list": [ 00:21:47.246 { 00:21:47.246 "name": "BaseBdev1", 00:21:47.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.246 "is_configured": false, 00:21:47.246 "data_offset": 0, 00:21:47.246 "data_size": 0 00:21:47.246 }, 00:21:47.246 { 00:21:47.246 "name": "BaseBdev2", 00:21:47.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.246 "is_configured": false, 00:21:47.246 "data_offset": 0, 00:21:47.246 "data_size": 0 00:21:47.246 } 00:21:47.246 ] 00:21:47.246 }' 00:21:47.246 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.246 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:47.812 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:47.813 [2024-07-24 18:58:32.718102] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:47.813 [2024-07-24 18:58:32.718123] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x146ab80 name Existed_Raid, state configuring 00:21:47.813 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:48.071 [2024-07-24 18:58:32.890571] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:48.071 [2024-07-24 18:58:32.890589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:48.071 [2024-07-24 18:58:32.890594] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:21:48.071 [2024-07-24 18:58:32.890600] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:48.071 18:58:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:21:48.071 [2024-07-24 18:58:33.067907] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:48.071 BaseBdev1 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.330 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:48.587 [ 00:21:48.587 { 00:21:48.587 "name": "BaseBdev1", 00:21:48.587 "aliases": [ 00:21:48.587 "b574f4d8-51ea-4867-9e93-3e186515b7af" 00:21:48.587 ], 00:21:48.587 "product_name": "Malloc disk", 00:21:48.587 "block_size": 4096, 00:21:48.587 "num_blocks": 8192, 00:21:48.587 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:48.587 "md_size": 32, 00:21:48.587 "md_interleave": false, 00:21:48.587 "dif_type": 0, 00:21:48.587 "assigned_rate_limits": { 00:21:48.587 "rw_ios_per_sec": 0, 00:21:48.587 "rw_mbytes_per_sec": 0, 00:21:48.587 "r_mbytes_per_sec": 0, 00:21:48.587 "w_mbytes_per_sec": 0 00:21:48.587 }, 00:21:48.587 "claimed": true, 00:21:48.587 "claim_type": "exclusive_write", 00:21:48.587 "zoned": false, 00:21:48.587 "supported_io_types": { 00:21:48.587 "read": true, 00:21:48.587 "write": true, 00:21:48.587 "unmap": true, 00:21:48.587 "flush": true, 00:21:48.587 "reset": true, 00:21:48.587 "nvme_admin": false, 00:21:48.587 "nvme_io": false, 00:21:48.587 "nvme_io_md": false, 00:21:48.587 "write_zeroes": true, 00:21:48.587 "zcopy": true, 00:21:48.587 "get_zone_info": false, 00:21:48.587 "zone_management": false, 00:21:48.587 "zone_append": false, 00:21:48.587 "compare": false, 00:21:48.587 "compare_and_write": false, 00:21:48.587 "abort": true, 00:21:48.587 "seek_hole": false, 00:21:48.587 "seek_data": false, 00:21:48.587 "copy": true, 00:21:48.587 "nvme_iov_md": false 00:21:48.587 }, 00:21:48.587 "memory_domains": [ 00:21:48.587 { 00:21:48.587 "dma_device_id": "system", 00:21:48.587 "dma_device_type": 1 00:21:48.587 }, 00:21:48.587 { 00:21:48.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.587 "dma_device_type": 2 00:21:48.587 } 00:21:48.587 ], 00:21:48.587 "driver_specific": {} 00:21:48.587 } 00:21:48.587 ] 00:21:48.587 18:58:33 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.587 "name": "Existed_Raid", 00:21:48.587 "uuid": "55b3c8d3-169d-4a48-adba-1cad5901e457", 00:21:48.587 "strip_size_kb": 0, 00:21:48.587 "state": "configuring", 00:21:48.587 "raid_level": "raid1", 00:21:48.587 "superblock": true, 00:21:48.587 "num_base_bdevs": 2, 00:21:48.587 "num_base_bdevs_discovered": 1, 00:21:48.587 "num_base_bdevs_operational": 2, 00:21:48.587 "base_bdevs_list": [ 00:21:48.587 { 00:21:48.587 "name": "BaseBdev1", 00:21:48.587 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:48.587 "is_configured": true, 00:21:48.587 "data_offset": 256, 00:21:48.587 "data_size": 7936 00:21:48.587 }, 00:21:48.587 { 00:21:48.587 "name": "BaseBdev2", 00:21:48.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.587 "is_configured": false, 00:21:48.587 "data_offset": 0, 00:21:48.587 "data_size": 0 00:21:48.587 } 00:21:48.587 ] 00:21:48.587 }' 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.587 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:49.151 18:58:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:49.151 [2024-07-24 18:58:34.154726] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:49.151 [2024-07-24 18:58:34.154759] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x146a470 name Existed_Raid, state configuring 00:21:49.408 18:58:34 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:49.408 [2024-07-24 18:58:34.311150] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:49.408 [2024-07-24 18:58:34.312180] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:49.408 [2024-07-24 18:58:34.312203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.408 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.666 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.666 "name": "Existed_Raid", 00:21:49.666 "uuid": "9e4d116c-0128-417d-aa6d-df54c63f8f67", 00:21:49.666 "strip_size_kb": 0, 00:21:49.666 "state": "configuring", 00:21:49.666 "raid_level": "raid1", 00:21:49.666 "superblock": true, 00:21:49.666 "num_base_bdevs": 2, 00:21:49.666 "num_base_bdevs_discovered": 1, 00:21:49.666 "num_base_bdevs_operational": 2, 00:21:49.666 "base_bdevs_list": [ 00:21:49.666 { 00:21:49.666 "name": "BaseBdev1", 00:21:49.666 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:49.666 "is_configured": true, 00:21:49.666 "data_offset": 256, 00:21:49.666 "data_size": 7936 00:21:49.666 }, 00:21:49.666 { 00:21:49.666 "name": "BaseBdev2", 00:21:49.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.666 "is_configured": false, 00:21:49.666 "data_offset": 0, 00:21:49.666 "data_size": 0 00:21:49.666 } 00:21:49.666 ] 00:21:49.666 }' 00:21:49.666 18:58:34 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.666 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:50.233 18:58:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:21:50.233 [2024-07-24 18:58:35.160664] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:50.233 [2024-07-24 18:58:35.160767] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1607cd0 00:21:50.233 [2024-07-24 18:58:35.160774] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:50.233 [2024-07-24 18:58:35.160814] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1607710 00:21:50.233 [2024-07-24 18:58:35.160875] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1607cd0 00:21:50.233 [2024-07-24 18:58:35.160880] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1607cd0 00:21:50.233 [2024-07-24 18:58:35.160924] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.233 BaseBdev2 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:50.233 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:50.491 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:50.748 [ 00:21:50.748 { 00:21:50.748 "name": "BaseBdev2", 00:21:50.748 "aliases": [ 00:21:50.748 "98b7983a-dbb9-4e9d-a43d-f9852cbedc00" 00:21:50.748 ], 00:21:50.748 "product_name": "Malloc disk", 00:21:50.748 "block_size": 4096, 00:21:50.748 "num_blocks": 8192, 00:21:50.748 "uuid": "98b7983a-dbb9-4e9d-a43d-f9852cbedc00", 00:21:50.748 "md_size": 32, 00:21:50.748 "md_interleave": false, 00:21:50.748 "dif_type": 0, 00:21:50.748 "assigned_rate_limits": { 00:21:50.748 "rw_ios_per_sec": 0, 00:21:50.748 "rw_mbytes_per_sec": 0, 00:21:50.748 "r_mbytes_per_sec": 0, 00:21:50.748 "w_mbytes_per_sec": 0 00:21:50.748 }, 00:21:50.748 "claimed": true, 00:21:50.748 "claim_type": "exclusive_write", 00:21:50.748 "zoned": false, 00:21:50.748 "supported_io_types": { 00:21:50.748 "read": true, 00:21:50.748 "write": true, 00:21:50.748 "unmap": true, 00:21:50.748 "flush": true, 00:21:50.748 "reset": true, 00:21:50.748 "nvme_admin": false, 00:21:50.748 "nvme_io": false, 00:21:50.748 
"nvme_io_md": false, 00:21:50.748 "write_zeroes": true, 00:21:50.748 "zcopy": true, 00:21:50.748 "get_zone_info": false, 00:21:50.748 "zone_management": false, 00:21:50.748 "zone_append": false, 00:21:50.748 "compare": false, 00:21:50.748 "compare_and_write": false, 00:21:50.748 "abort": true, 00:21:50.748 "seek_hole": false, 00:21:50.748 "seek_data": false, 00:21:50.748 "copy": true, 00:21:50.748 "nvme_iov_md": false 00:21:50.748 }, 00:21:50.748 "memory_domains": [ 00:21:50.748 { 00:21:50.748 "dma_device_id": "system", 00:21:50.748 "dma_device_type": 1 00:21:50.748 }, 00:21:50.748 { 00:21:50.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.748 "dma_device_type": 2 00:21:50.748 } 00:21:50.748 ], 00:21:50.748 "driver_specific": {} 00:21:50.748 } 00:21:50.748 ] 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.748 "name": "Existed_Raid", 00:21:50.748 "uuid": "9e4d116c-0128-417d-aa6d-df54c63f8f67", 00:21:50.748 "strip_size_kb": 0, 00:21:50.748 "state": "online", 00:21:50.748 "raid_level": "raid1", 00:21:50.748 "superblock": true, 00:21:50.748 "num_base_bdevs": 2, 00:21:50.748 "num_base_bdevs_discovered": 2, 00:21:50.748 "num_base_bdevs_operational": 2, 00:21:50.748 "base_bdevs_list": [ 00:21:50.748 { 00:21:50.748 "name": "BaseBdev1", 00:21:50.748 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:50.748 "is_configured": true, 00:21:50.748 "data_offset": 256, 00:21:50.748 "data_size": 7936 00:21:50.748 }, 00:21:50.748 { 
00:21:50.748 "name": "BaseBdev2", 00:21:50.748 "uuid": "98b7983a-dbb9-4e9d-a43d-f9852cbedc00", 00:21:50.748 "is_configured": true, 00:21:50.748 "data_offset": 256, 00:21:50.748 "data_size": 7936 00:21:50.748 } 00:21:50.748 ] 00:21:50.748 }' 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.748 18:58:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:51.312 [2024-07-24 18:58:36.271744] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:51.312 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:51.312 "name": "Existed_Raid", 00:21:51.312 "aliases": [ 00:21:51.312 "9e4d116c-0128-417d-aa6d-df54c63f8f67" 00:21:51.312 ], 00:21:51.312 "product_name": "Raid Volume", 00:21:51.312 "block_size": 4096, 00:21:51.312 "num_blocks": 7936, 00:21:51.312 "uuid": "9e4d116c-0128-417d-aa6d-df54c63f8f67", 00:21:51.312 "md_size": 32, 00:21:51.312 "md_interleave": false, 00:21:51.312 "dif_type": 0, 00:21:51.312 "assigned_rate_limits": { 00:21:51.312 "rw_ios_per_sec": 0, 00:21:51.312 "rw_mbytes_per_sec": 0, 00:21:51.312 "r_mbytes_per_sec": 0, 00:21:51.312 "w_mbytes_per_sec": 0 00:21:51.312 }, 00:21:51.312 "claimed": false, 00:21:51.312 "zoned": false, 00:21:51.312 "supported_io_types": { 00:21:51.312 "read": true, 00:21:51.312 "write": true, 00:21:51.312 "unmap": false, 00:21:51.312 "flush": false, 00:21:51.312 "reset": true, 00:21:51.312 "nvme_admin": false, 00:21:51.312 "nvme_io": false, 00:21:51.312 "nvme_io_md": false, 00:21:51.312 "write_zeroes": true, 00:21:51.312 "zcopy": false, 00:21:51.312 "get_zone_info": false, 00:21:51.312 "zone_management": false, 00:21:51.312 "zone_append": false, 00:21:51.312 "compare": false, 00:21:51.312 "compare_and_write": false, 00:21:51.312 "abort": false, 00:21:51.312 "seek_hole": false, 00:21:51.312 "seek_data": false, 00:21:51.312 "copy": false, 00:21:51.312 "nvme_iov_md": false 00:21:51.312 }, 00:21:51.312 "memory_domains": [ 00:21:51.312 { 00:21:51.312 "dma_device_id": "system", 00:21:51.312 "dma_device_type": 1 00:21:51.312 }, 00:21:51.312 { 00:21:51.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.312 "dma_device_type": 2 00:21:51.312 }, 00:21:51.312 { 00:21:51.312 "dma_device_id": "system", 00:21:51.312 "dma_device_type": 1 00:21:51.312 }, 00:21:51.312 { 00:21:51.312 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.312 "dma_device_type": 2 00:21:51.312 } 00:21:51.312 ], 00:21:51.312 "driver_specific": { 00:21:51.312 "raid": { 00:21:51.312 "uuid": "9e4d116c-0128-417d-aa6d-df54c63f8f67", 00:21:51.312 "strip_size_kb": 0, 00:21:51.312 "state": "online", 00:21:51.312 "raid_level": "raid1", 00:21:51.312 "superblock": true, 00:21:51.312 "num_base_bdevs": 2, 00:21:51.312 "num_base_bdevs_discovered": 2, 00:21:51.312 "num_base_bdevs_operational": 2, 00:21:51.312 "base_bdevs_list": [ 00:21:51.312 { 00:21:51.312 "name": "BaseBdev1", 00:21:51.312 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:51.313 "is_configured": true, 00:21:51.313 "data_offset": 256, 00:21:51.313 "data_size": 7936 00:21:51.313 }, 00:21:51.313 { 00:21:51.313 "name": "BaseBdev2", 00:21:51.313 "uuid": "98b7983a-dbb9-4e9d-a43d-f9852cbedc00", 00:21:51.313 "is_configured": true, 00:21:51.313 "data_offset": 256, 00:21:51.313 "data_size": 7936 00:21:51.313 } 00:21:51.313 ] 00:21:51.313 } 00:21:51.313 } 00:21:51.313 }' 00:21:51.313 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:51.570 BaseBdev2' 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:51.570 "name": "BaseBdev1", 00:21:51.570 "aliases": [ 00:21:51.570 "b574f4d8-51ea-4867-9e93-3e186515b7af" 00:21:51.570 ], 00:21:51.570 "product_name": "Malloc disk", 00:21:51.570 "block_size": 4096, 00:21:51.570 "num_blocks": 8192, 00:21:51.570 "uuid": "b574f4d8-51ea-4867-9e93-3e186515b7af", 00:21:51.570 "md_size": 32, 00:21:51.570 "md_interleave": false, 00:21:51.570 "dif_type": 0, 00:21:51.570 "assigned_rate_limits": { 00:21:51.570 "rw_ios_per_sec": 0, 00:21:51.570 "rw_mbytes_per_sec": 0, 00:21:51.570 "r_mbytes_per_sec": 0, 00:21:51.570 "w_mbytes_per_sec": 0 00:21:51.570 }, 00:21:51.570 "claimed": true, 00:21:51.570 "claim_type": "exclusive_write", 00:21:51.570 "zoned": false, 00:21:51.570 "supported_io_types": { 00:21:51.570 "read": true, 00:21:51.570 "write": true, 00:21:51.570 "unmap": true, 00:21:51.570 "flush": true, 00:21:51.570 "reset": true, 00:21:51.570 "nvme_admin": false, 00:21:51.570 "nvme_io": false, 00:21:51.570 "nvme_io_md": false, 00:21:51.570 "write_zeroes": true, 00:21:51.570 "zcopy": true, 00:21:51.570 "get_zone_info": false, 00:21:51.570 "zone_management": false, 00:21:51.570 "zone_append": false, 00:21:51.570 "compare": false, 00:21:51.570 "compare_and_write": false, 00:21:51.570 "abort": true, 00:21:51.570 "seek_hole": false, 00:21:51.570 "seek_data": false, 00:21:51.570 "copy": true, 00:21:51.570 "nvme_iov_md": false 00:21:51.570 }, 00:21:51.570 "memory_domains": [ 00:21:51.570 { 00:21:51.570 "dma_device_id": "system", 00:21:51.570 "dma_device_type": 1 00:21:51.570 }, 00:21:51.570 { 00:21:51.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:51.570 "dma_device_type": 2 00:21:51.570 } 00:21:51.570 ], 00:21:51.570 "driver_specific": {} 00:21:51.570 }' 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:51.570 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:51.829 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:52.088 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:52.088 "name": "BaseBdev2", 00:21:52.088 "aliases": [ 00:21:52.088 "98b7983a-dbb9-4e9d-a43d-f9852cbedc00" 00:21:52.088 ], 00:21:52.088 "product_name": "Malloc disk", 00:21:52.088 "block_size": 4096, 00:21:52.088 "num_blocks": 8192, 00:21:52.088 "uuid": "98b7983a-dbb9-4e9d-a43d-f9852cbedc00", 00:21:52.088 "md_size": 32, 00:21:52.088 "md_interleave": false, 00:21:52.088 "dif_type": 0, 00:21:52.088 "assigned_rate_limits": { 00:21:52.088 "rw_ios_per_sec": 0, 00:21:52.088 "rw_mbytes_per_sec": 0, 00:21:52.088 "r_mbytes_per_sec": 0, 00:21:52.088 "w_mbytes_per_sec": 0 00:21:52.088 }, 00:21:52.088 "claimed": true, 00:21:52.088 "claim_type": "exclusive_write", 00:21:52.088 "zoned": false, 00:21:52.088 "supported_io_types": { 00:21:52.088 "read": true, 00:21:52.088 "write": true, 00:21:52.088 "unmap": true, 00:21:52.088 "flush": true, 00:21:52.088 "reset": true, 00:21:52.088 "nvme_admin": false, 00:21:52.088 "nvme_io": false, 00:21:52.088 "nvme_io_md": false, 00:21:52.088 "write_zeroes": true, 00:21:52.088 "zcopy": true, 00:21:52.088 "get_zone_info": false, 00:21:52.088 "zone_management": false, 00:21:52.088 "zone_append": false, 00:21:52.088 "compare": false, 00:21:52.088 "compare_and_write": false, 00:21:52.088 "abort": true, 00:21:52.088 "seek_hole": false, 00:21:52.088 "seek_data": false, 00:21:52.088 "copy": true, 00:21:52.088 "nvme_iov_md": false 00:21:52.088 }, 00:21:52.088 "memory_domains": [ 00:21:52.088 { 00:21:52.088 
"dma_device_id": "system", 00:21:52.088 "dma_device_type": 1 00:21:52.088 }, 00:21:52.088 { 00:21:52.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.088 "dma_device_type": 2 00:21:52.088 } 00:21:52.088 ], 00:21:52.088 "driver_specific": {} 00:21:52.088 }' 00:21:52.088 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.088 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.088 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:52.088 18:58:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.088 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.088 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:52.088 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.088 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.347 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:52.347 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.347 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.347 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:52.347 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:52.347 [2024-07-24 18:58:37.350388] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.604 18:58:37 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.604 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.605 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.605 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.605 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.605 "name": "Existed_Raid", 00:21:52.605 "uuid": "9e4d116c-0128-417d-aa6d-df54c63f8f67", 00:21:52.605 "strip_size_kb": 0, 00:21:52.605 "state": "online", 00:21:52.605 "raid_level": "raid1", 00:21:52.605 "superblock": true, 00:21:52.605 "num_base_bdevs": 2, 00:21:52.605 "num_base_bdevs_discovered": 1, 00:21:52.605 "num_base_bdevs_operational": 1, 00:21:52.605 "base_bdevs_list": [ 00:21:52.605 { 00:21:52.605 "name": null, 00:21:52.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.605 "is_configured": false, 00:21:52.605 "data_offset": 256, 00:21:52.605 "data_size": 7936 00:21:52.605 }, 00:21:52.605 { 00:21:52.605 "name": "BaseBdev2", 00:21:52.605 "uuid": "98b7983a-dbb9-4e9d-a43d-f9852cbedc00", 00:21:52.605 "is_configured": true, 00:21:52.605 "data_offset": 256, 00:21:52.605 "data_size": 7936 00:21:52.605 } 00:21:52.605 ] 00:21:52.605 }' 00:21:52.605 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.605 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:53.170 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:53.170 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:53.170 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.170 18:58:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:53.170 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:53.170 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:53.170 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:53.427 [2024-07-24 18:58:38.322723] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:53.427 [2024-07-24 18:58:38.322791] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:53.427 [2024-07-24 18:58:38.333090] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:53.427 [2024-07-24 18:58:38.333116] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:53.427 [2024-07-24 18:58:38.333121] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1607cd0 name Existed_Raid, state offline 00:21:53.427 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:53.427 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:53.427 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.427 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2190512 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2190512 ']' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2190512 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2190512 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2190512' 00:21:53.685 killing process with pid 2190512 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2190512 00:21:53.685 [2024-07-24 18:58:38.551819] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:53.685 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2190512 00:21:53.685 [2024-07-24 18:58:38.552601] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:53.944 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:21:53.944 00:21:53.944 real 0m7.787s 00:21:53.944 user 0m13.930s 00:21:53.944 sys 0m1.247s 00:21:53.944 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:53.944 18:58:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:53.944 ************************************ 00:21:53.944 END TEST raid_state_function_test_sb_md_separate 00:21:53.944 ************************************ 00:21:53.944 18:58:38 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:21:53.944 18:58:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 
1 ']' 00:21:53.944 18:58:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:53.944 18:58:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:53.944 ************************************ 00:21:53.944 START TEST raid_superblock_test_md_separate 00:21:53.944 ************************************ 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2192007 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2192007 /var/tmp/spdk-raid.sock 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2192007 ']' 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:53.944 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:53.944 18:58:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:53.944 [2024-07-24 18:58:38.820200] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:21:53.944 [2024-07-24 18:58:38.820237] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2192007 ] 00:21:53.944 [2024-07-24 18:58:38.883069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:54.203 [2024-07-24 18:58:38.961554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:54.203 [2024-07-24 18:58:39.012039] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:54.203 [2024-07-24 18:58:39.012065] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:54.770 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:21:55.028 malloc1 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:55.028 [2024-07-24 18:58:39.944042] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:55.028 [2024-07-24 18:58:39.944077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.028 [2024-07-24 18:58:39.944089] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23fa7c0 00:21:55.028 [2024-07-24 18:58:39.944110] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.028 [2024-07-24 18:58:39.945163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.028 [2024-07-24 18:58:39.945183] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:55.028 
pt1 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:55.028 18:58:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:21:55.286 malloc2 00:21:55.286 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:55.286 [2024-07-24 18:58:40.293121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:55.286 [2024-07-24 18:58:40.293153] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.286 [2024-07-24 18:58:40.293163] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2586ef0 00:21:55.286 [2024-07-24 18:58:40.293169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.286 [2024-07-24 18:58:40.294267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.286 [2024-07-24 18:58:40.294286] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:55.544 pt2 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:55.544 [2024-07-24 18:58:40.461571] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:55.544 [2024-07-24 18:58:40.462428] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:55.544 [2024-07-24 18:58:40.462536] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x257c420 00:21:55.544 [2024-07-24 18:58:40.462544] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:55.544 [2024-07-24 18:58:40.462589] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x257d4a0 00:21:55.544 [2024-07-24 18:58:40.462663] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x257c420 00:21:55.544 [2024-07-24 18:58:40.462668] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x257c420 00:21:55.544 [2024-07-24 18:58:40.462709] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.544 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.802 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.802 "name": "raid_bdev1", 00:21:55.802 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:21:55.802 "strip_size_kb": 0, 00:21:55.802 "state": "online", 00:21:55.802 "raid_level": "raid1", 00:21:55.802 "superblock": true, 00:21:55.802 "num_base_bdevs": 2, 00:21:55.802 "num_base_bdevs_discovered": 2, 00:21:55.802 "num_base_bdevs_operational": 2, 00:21:55.802 "base_bdevs_list": [ 00:21:55.802 { 00:21:55.802 "name": "pt1", 00:21:55.802 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:55.802 "is_configured": true, 00:21:55.802 "data_offset": 256, 00:21:55.802 "data_size": 7936 00:21:55.802 }, 00:21:55.802 { 00:21:55.802 "name": "pt2", 00:21:55.802 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:55.802 "is_configured": true, 00:21:55.802 "data_offset": 256, 00:21:55.802 "data_size": 7936 00:21:55.802 } 00:21:55.802 ] 00:21:55.802 }' 00:21:55.802 18:58:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.802 18:58:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:56.370 18:58:41 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:56.370 [2024-07-24 18:58:41.231717] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:56.370 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:56.370 "name": "raid_bdev1", 00:21:56.370 "aliases": [ 00:21:56.370 "34354063-6106-4b7e-9059-b2c9f793c293" 00:21:56.370 ], 00:21:56.370 "product_name": "Raid Volume", 00:21:56.370 "block_size": 4096, 00:21:56.370 "num_blocks": 7936, 00:21:56.370 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:21:56.370 "md_size": 32, 00:21:56.370 "md_interleave": false, 00:21:56.370 "dif_type": 0, 00:21:56.370 "assigned_rate_limits": { 00:21:56.370 "rw_ios_per_sec": 0, 00:21:56.370 "rw_mbytes_per_sec": 0, 00:21:56.370 "r_mbytes_per_sec": 0, 00:21:56.370 "w_mbytes_per_sec": 0 00:21:56.370 }, 00:21:56.370 "claimed": false, 00:21:56.370 "zoned": false, 00:21:56.370 "supported_io_types": { 00:21:56.370 "read": true, 00:21:56.370 "write": true, 00:21:56.370 "unmap": false, 00:21:56.370 "flush": false, 00:21:56.370 "reset": true, 00:21:56.370 "nvme_admin": false, 00:21:56.370 "nvme_io": false, 00:21:56.370 "nvme_io_md": false, 00:21:56.370 "write_zeroes": true, 00:21:56.371 "zcopy": false, 00:21:56.371 "get_zone_info": false, 00:21:56.371 "zone_management": false, 00:21:56.371 "zone_append": false, 00:21:56.371 "compare": false, 00:21:56.371 "compare_and_write": false, 00:21:56.371 "abort": false, 00:21:56.371 "seek_hole": false, 00:21:56.371 "seek_data": false, 00:21:56.371 "copy": false, 00:21:56.371 "nvme_iov_md": false 00:21:56.371 }, 00:21:56.371 "memory_domains": [ 00:21:56.371 { 00:21:56.371 "dma_device_id": "system", 00:21:56.371 "dma_device_type": 1 00:21:56.371 }, 00:21:56.371 { 00:21:56.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.371 "dma_device_type": 2 00:21:56.371 }, 00:21:56.371 { 00:21:56.371 "dma_device_id": "system", 00:21:56.371 "dma_device_type": 1 00:21:56.371 }, 00:21:56.371 { 00:21:56.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.371 "dma_device_type": 2 00:21:56.371 } 00:21:56.371 ], 00:21:56.371 "driver_specific": { 00:21:56.371 "raid": { 00:21:56.371 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:21:56.371 "strip_size_kb": 0, 00:21:56.371 "state": "online", 00:21:56.371 "raid_level": "raid1", 00:21:56.371 "superblock": true, 00:21:56.371 "num_base_bdevs": 2, 00:21:56.371 "num_base_bdevs_discovered": 2, 00:21:56.371 "num_base_bdevs_operational": 2, 00:21:56.371 "base_bdevs_list": [ 00:21:56.371 { 00:21:56.371 "name": "pt1", 00:21:56.371 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:56.371 "is_configured": true, 00:21:56.371 "data_offset": 256, 00:21:56.371 "data_size": 7936 00:21:56.371 }, 00:21:56.371 { 00:21:56.371 "name": "pt2", 00:21:56.371 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:56.371 "is_configured": true, 00:21:56.371 "data_offset": 256, 00:21:56.371 "data_size": 7936 00:21:56.371 } 00:21:56.371 ] 00:21:56.371 } 00:21:56.371 } 00:21:56.371 }' 00:21:56.371 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:21:56.371 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:56.371 pt2' 00:21:56.371 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:56.371 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:56.371 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.629 "name": "pt1", 00:21:56.629 "aliases": [ 00:21:56.629 "00000000-0000-0000-0000-000000000001" 00:21:56.629 ], 00:21:56.629 "product_name": "passthru", 00:21:56.629 "block_size": 4096, 00:21:56.629 "num_blocks": 8192, 00:21:56.629 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:56.629 "md_size": 32, 00:21:56.629 "md_interleave": false, 00:21:56.629 "dif_type": 0, 00:21:56.629 "assigned_rate_limits": { 00:21:56.629 "rw_ios_per_sec": 0, 00:21:56.629 "rw_mbytes_per_sec": 0, 00:21:56.629 "r_mbytes_per_sec": 0, 00:21:56.629 "w_mbytes_per_sec": 0 00:21:56.629 }, 00:21:56.629 "claimed": true, 00:21:56.629 "claim_type": "exclusive_write", 00:21:56.629 "zoned": false, 00:21:56.629 "supported_io_types": { 00:21:56.629 "read": true, 00:21:56.629 "write": true, 00:21:56.629 "unmap": true, 00:21:56.629 "flush": true, 00:21:56.629 "reset": true, 00:21:56.629 "nvme_admin": false, 00:21:56.629 "nvme_io": false, 00:21:56.629 "nvme_io_md": false, 00:21:56.629 "write_zeroes": true, 00:21:56.629 "zcopy": true, 00:21:56.629 "get_zone_info": false, 00:21:56.629 "zone_management": false, 00:21:56.629 "zone_append": false, 00:21:56.629 "compare": false, 00:21:56.629 "compare_and_write": false, 00:21:56.629 "abort": true, 00:21:56.629 "seek_hole": false, 00:21:56.629 "seek_data": false, 00:21:56.629 "copy": true, 00:21:56.629 "nvme_iov_md": false 00:21:56.629 }, 00:21:56.629 "memory_domains": [ 00:21:56.629 { 00:21:56.629 "dma_device_id": "system", 00:21:56.629 "dma_device_type": 1 00:21:56.629 }, 00:21:56.629 { 00:21:56.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.629 "dma_device_type": 2 00:21:56.629 } 00:21:56.629 ], 00:21:56.629 "driver_specific": { 00:21:56.629 "passthru": { 00:21:56.629 "name": "pt1", 00:21:56.629 "base_bdev_name": "malloc1" 00:21:56.629 } 00:21:56.629 } 00:21:56.629 }' 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.629 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 
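For reference, the bdev_raid.sh helpers traced above reduce to a short JSON-RPC sequence against the test's SPDK target. The following is a minimal sketch, not the test script itself: it assumes a target is already listening on the socket this run uses (/var/tmp/spdk-raid.sock), reuses the rpc.py path shown in the trace, and assumes malloc1 is created the same way as the malloc2 call that does appear in this excerpt. The expected values (block_size 4096, md_size 32, md_interleave false, dif_type 0) are the ones the jq checks above compare against.

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # malloc base bdevs (size 32, block size 4096, 32-byte separate metadata)
  # (malloc1 assumed; only the malloc2 call appears in this excerpt)
  $RPC bdev_malloc_create 32 4096 -m 32 -b malloc1
  $RPC bdev_malloc_create 32 4096 -m 32 -b malloc2

  # passthru bdevs layered on top, with the fixed UUIDs used by the test
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # raid1 bdev over both passthru bdevs; -s writes an on-disk superblock
  $RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s

  # query raid state and the md-separate layout of a base bdev
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  $RPC bdev_get_bdevs -b pt1 | jq '.[] | .block_size, .md_size, .md_interleave, .dif_type'

Teardown is the reverse of the setup (bdev_raid_delete raid_bdev1, then bdev_passthru_delete pt1/pt2), which is what the delete and re-create error-path checks later in this test exercise.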
00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.887 "name": "pt2", 00:21:56.887 "aliases": [ 00:21:56.887 "00000000-0000-0000-0000-000000000002" 00:21:56.887 ], 00:21:56.887 "product_name": "passthru", 00:21:56.887 "block_size": 4096, 00:21:56.887 "num_blocks": 8192, 00:21:56.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:56.887 "md_size": 32, 00:21:56.887 "md_interleave": false, 00:21:56.887 "dif_type": 0, 00:21:56.887 "assigned_rate_limits": { 00:21:56.887 "rw_ios_per_sec": 0, 00:21:56.887 "rw_mbytes_per_sec": 0, 00:21:56.887 "r_mbytes_per_sec": 0, 00:21:56.887 "w_mbytes_per_sec": 0 00:21:56.887 }, 00:21:56.887 "claimed": true, 00:21:56.887 "claim_type": "exclusive_write", 00:21:56.887 "zoned": false, 00:21:56.887 "supported_io_types": { 00:21:56.887 "read": true, 00:21:56.887 "write": true, 00:21:56.887 "unmap": true, 00:21:56.887 "flush": true, 00:21:56.887 "reset": true, 00:21:56.887 "nvme_admin": false, 00:21:56.887 "nvme_io": false, 00:21:56.887 "nvme_io_md": false, 00:21:56.887 "write_zeroes": true, 00:21:56.887 "zcopy": true, 00:21:56.887 "get_zone_info": false, 00:21:56.887 "zone_management": false, 00:21:56.887 "zone_append": false, 00:21:56.887 "compare": false, 00:21:56.887 "compare_and_write": false, 00:21:56.887 "abort": true, 00:21:56.887 "seek_hole": false, 00:21:56.887 "seek_data": false, 00:21:56.887 "copy": true, 00:21:56.887 "nvme_iov_md": false 00:21:56.887 }, 00:21:56.887 "memory_domains": [ 00:21:56.887 { 00:21:56.887 "dma_device_id": "system", 00:21:56.887 "dma_device_type": 1 00:21:56.887 }, 00:21:56.887 { 00:21:56.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.887 "dma_device_type": 2 00:21:56.887 } 00:21:56.887 ], 00:21:56.887 "driver_specific": { 00:21:56.887 "passthru": { 00:21:56.887 "name": "pt2", 00:21:56.887 "base_bdev_name": "malloc2" 00:21:56.887 } 00:21:56.887 } 00:21:56.887 }' 00:21:56.887 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.145 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.145 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:57.145 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.145 18:58:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.145 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:57.403 [2024-07-24 18:58:42.314508] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=34354063-6106-4b7e-9059-b2c9f793c293 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 34354063-6106-4b7e-9059-b2c9f793c293 ']' 00:21:57.403 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:57.661 [2024-07-24 18:58:42.486777] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:57.661 [2024-07-24 18:58:42.486791] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:57.661 [2024-07-24 18:58:42.486829] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:57.661 [2024-07-24 18:58:42.486864] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:57.661 [2024-07-24 18:58:42.486870] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257c420 name raid_bdev1, state offline 00:21:57.661 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.661 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:57.662 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:57.662 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:57.662 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:57.662 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:57.921 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:57.921 18:58:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:58.180 18:58:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:58.180 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:58.438 [2024-07-24 18:58:43.328931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:58.438 [2024-07-24 18:58:43.329930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:58.438 [2024-07-24 18:58:43.329971] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:58.438 [2024-07-24 18:58:43.330000] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:58.438 [2024-07-24 18:58:43.330010] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:58.438 [2024-07-24 18:58:43.330031] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257c6a0 name raid_bdev1, state configuring 00:21:58.438 request: 00:21:58.438 { 00:21:58.438 "name": "raid_bdev1", 00:21:58.438 "raid_level": "raid1", 00:21:58.438 "base_bdevs": [ 00:21:58.438 "malloc1", 00:21:58.438 "malloc2" 00:21:58.438 ], 00:21:58.438 "superblock": false, 00:21:58.438 "method": "bdev_raid_create", 00:21:58.438 "req_id": 1 
00:21:58.438 } 00:21:58.438 Got JSON-RPC error response 00:21:58.438 response: 00:21:58.438 { 00:21:58.438 "code": -17, 00:21:58.438 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:58.438 } 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.438 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:58.695 [2024-07-24 18:58:43.669787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:58.695 [2024-07-24 18:58:43.669815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.695 [2024-07-24 18:58:43.669825] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23fa9f0 00:21:58.695 [2024-07-24 18:58:43.669831] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.695 [2024-07-24 18:58:43.670900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.695 [2024-07-24 18:58:43.670919] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:58.695 [2024-07-24 18:58:43.670947] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:58.695 [2024-07-24 18:58:43.670969] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:58.695 pt1 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.695 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.952 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.952 "name": "raid_bdev1", 00:21:58.952 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:21:58.952 "strip_size_kb": 0, 00:21:58.952 "state": "configuring", 00:21:58.952 "raid_level": "raid1", 00:21:58.952 "superblock": true, 00:21:58.952 "num_base_bdevs": 2, 00:21:58.952 "num_base_bdevs_discovered": 1, 00:21:58.952 "num_base_bdevs_operational": 2, 00:21:58.952 "base_bdevs_list": [ 00:21:58.952 { 00:21:58.952 "name": "pt1", 00:21:58.952 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:58.952 "is_configured": true, 00:21:58.952 "data_offset": 256, 00:21:58.952 "data_size": 7936 00:21:58.952 }, 00:21:58.952 { 00:21:58.952 "name": null, 00:21:58.952 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:58.952 "is_configured": false, 00:21:58.952 "data_offset": 256, 00:21:58.952 "data_size": 7936 00:21:58.952 } 00:21:58.952 ] 00:21:58.952 }' 00:21:58.952 18:58:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.952 18:58:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:59.517 [2024-07-24 18:58:44.483902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:59.517 [2024-07-24 18:58:44.483939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:59.517 [2024-07-24 18:58:44.483950] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x257deb0 00:21:59.517 [2024-07-24 18:58:44.483957] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:59.517 [2024-07-24 18:58:44.484098] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:59.517 [2024-07-24 18:58:44.484107] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:59.517 [2024-07-24 18:58:44.484134] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:59.517 [2024-07-24 18:58:44.484146] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:59.517 [2024-07-24 18:58:44.484207] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23f9280 00:21:59.517 [2024-07-24 18:58:44.484212] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:59.517 [2024-07-24 18:58:44.484255] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x257f4a0 00:21:59.517 [2024-07-24 18:58:44.484320] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23f9280 00:21:59.517 [2024-07-24 18:58:44.484325] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23f9280 00:21:59.517 [2024-07-24 18:58:44.484373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:59.517 pt2 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.517 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.775 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.775 "name": "raid_bdev1", 00:21:59.775 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:21:59.775 "strip_size_kb": 0, 00:21:59.775 "state": "online", 00:21:59.775 "raid_level": "raid1", 00:21:59.775 "superblock": true, 00:21:59.775 "num_base_bdevs": 2, 00:21:59.775 "num_base_bdevs_discovered": 2, 00:21:59.775 "num_base_bdevs_operational": 2, 00:21:59.775 "base_bdevs_list": [ 00:21:59.775 { 00:21:59.775 "name": "pt1", 00:21:59.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:59.775 "is_configured": true, 00:21:59.775 "data_offset": 256, 00:21:59.775 "data_size": 7936 00:21:59.775 }, 00:21:59.775 { 00:21:59.775 "name": "pt2", 00:21:59.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:59.775 "is_configured": true, 00:21:59.775 "data_offset": 256, 00:21:59.775 "data_size": 7936 00:21:59.775 } 00:21:59.775 ] 00:21:59.775 }' 00:21:59.775 18:58:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.775 18:58:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:00.339 [2024-07-24 18:58:45.318237] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:00.339 "name": "raid_bdev1", 00:22:00.339 "aliases": [ 00:22:00.339 "34354063-6106-4b7e-9059-b2c9f793c293" 00:22:00.339 ], 00:22:00.339 "product_name": "Raid Volume", 00:22:00.339 "block_size": 4096, 00:22:00.339 "num_blocks": 7936, 00:22:00.339 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:22:00.339 "md_size": 32, 00:22:00.339 "md_interleave": false, 00:22:00.339 "dif_type": 0, 00:22:00.339 "assigned_rate_limits": { 00:22:00.339 "rw_ios_per_sec": 0, 00:22:00.339 "rw_mbytes_per_sec": 0, 00:22:00.339 "r_mbytes_per_sec": 0, 00:22:00.339 "w_mbytes_per_sec": 0 00:22:00.339 }, 00:22:00.339 "claimed": false, 00:22:00.339 "zoned": false, 00:22:00.339 "supported_io_types": { 00:22:00.339 "read": true, 00:22:00.339 "write": true, 00:22:00.339 "unmap": false, 00:22:00.339 "flush": false, 00:22:00.339 "reset": true, 00:22:00.339 "nvme_admin": false, 00:22:00.339 "nvme_io": false, 00:22:00.339 "nvme_io_md": false, 00:22:00.339 "write_zeroes": true, 00:22:00.339 "zcopy": false, 00:22:00.339 "get_zone_info": false, 00:22:00.339 "zone_management": false, 00:22:00.339 "zone_append": false, 00:22:00.339 "compare": false, 00:22:00.339 "compare_and_write": false, 00:22:00.339 "abort": false, 00:22:00.339 "seek_hole": false, 00:22:00.339 "seek_data": false, 00:22:00.339 "copy": false, 00:22:00.339 "nvme_iov_md": false 00:22:00.339 }, 00:22:00.339 "memory_domains": [ 00:22:00.339 { 00:22:00.339 "dma_device_id": "system", 00:22:00.339 "dma_device_type": 1 00:22:00.339 }, 00:22:00.339 { 00:22:00.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.339 "dma_device_type": 2 00:22:00.339 }, 00:22:00.339 { 00:22:00.339 "dma_device_id": "system", 00:22:00.339 "dma_device_type": 1 00:22:00.339 }, 00:22:00.339 { 00:22:00.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.339 "dma_device_type": 2 00:22:00.339 } 00:22:00.339 ], 00:22:00.339 "driver_specific": { 00:22:00.339 "raid": { 00:22:00.339 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:22:00.339 "strip_size_kb": 0, 00:22:00.339 "state": "online", 00:22:00.339 "raid_level": "raid1", 00:22:00.339 "superblock": true, 00:22:00.339 "num_base_bdevs": 2, 00:22:00.339 "num_base_bdevs_discovered": 2, 00:22:00.339 "num_base_bdevs_operational": 2, 00:22:00.339 "base_bdevs_list": [ 00:22:00.339 { 00:22:00.339 "name": "pt1", 00:22:00.339 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.339 
"is_configured": true, 00:22:00.339 "data_offset": 256, 00:22:00.339 "data_size": 7936 00:22:00.339 }, 00:22:00.339 { 00:22:00.339 "name": "pt2", 00:22:00.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:00.339 "is_configured": true, 00:22:00.339 "data_offset": 256, 00:22:00.339 "data_size": 7936 00:22:00.339 } 00:22:00.339 ] 00:22:00.339 } 00:22:00.339 } 00:22:00.339 }' 00:22:00.339 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:00.597 pt2' 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:00.597 "name": "pt1", 00:22:00.597 "aliases": [ 00:22:00.597 "00000000-0000-0000-0000-000000000001" 00:22:00.597 ], 00:22:00.597 "product_name": "passthru", 00:22:00.597 "block_size": 4096, 00:22:00.597 "num_blocks": 8192, 00:22:00.597 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.597 "md_size": 32, 00:22:00.597 "md_interleave": false, 00:22:00.597 "dif_type": 0, 00:22:00.597 "assigned_rate_limits": { 00:22:00.597 "rw_ios_per_sec": 0, 00:22:00.597 "rw_mbytes_per_sec": 0, 00:22:00.597 "r_mbytes_per_sec": 0, 00:22:00.597 "w_mbytes_per_sec": 0 00:22:00.597 }, 00:22:00.597 "claimed": true, 00:22:00.597 "claim_type": "exclusive_write", 00:22:00.597 "zoned": false, 00:22:00.597 "supported_io_types": { 00:22:00.597 "read": true, 00:22:00.597 "write": true, 00:22:00.597 "unmap": true, 00:22:00.597 "flush": true, 00:22:00.597 "reset": true, 00:22:00.597 "nvme_admin": false, 00:22:00.597 "nvme_io": false, 00:22:00.597 "nvme_io_md": false, 00:22:00.597 "write_zeroes": true, 00:22:00.597 "zcopy": true, 00:22:00.597 "get_zone_info": false, 00:22:00.597 "zone_management": false, 00:22:00.597 "zone_append": false, 00:22:00.597 "compare": false, 00:22:00.597 "compare_and_write": false, 00:22:00.597 "abort": true, 00:22:00.597 "seek_hole": false, 00:22:00.597 "seek_data": false, 00:22:00.597 "copy": true, 00:22:00.597 "nvme_iov_md": false 00:22:00.597 }, 00:22:00.597 "memory_domains": [ 00:22:00.597 { 00:22:00.597 "dma_device_id": "system", 00:22:00.597 "dma_device_type": 1 00:22:00.597 }, 00:22:00.597 { 00:22:00.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.597 "dma_device_type": 2 00:22:00.597 } 00:22:00.597 ], 00:22:00.597 "driver_specific": { 00:22:00.597 "passthru": { 00:22:00.597 "name": "pt1", 00:22:00.597 "base_bdev_name": "malloc1" 00:22:00.597 } 00:22:00.597 } 00:22:00.597 }' 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.597 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.855 18:58:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:00.855 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:01.114 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:01.114 "name": "pt2", 00:22:01.114 "aliases": [ 00:22:01.114 "00000000-0000-0000-0000-000000000002" 00:22:01.114 ], 00:22:01.114 "product_name": "passthru", 00:22:01.114 "block_size": 4096, 00:22:01.114 "num_blocks": 8192, 00:22:01.114 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.114 "md_size": 32, 00:22:01.114 "md_interleave": false, 00:22:01.114 "dif_type": 0, 00:22:01.114 "assigned_rate_limits": { 00:22:01.114 "rw_ios_per_sec": 0, 00:22:01.114 "rw_mbytes_per_sec": 0, 00:22:01.114 "r_mbytes_per_sec": 0, 00:22:01.114 "w_mbytes_per_sec": 0 00:22:01.114 }, 00:22:01.114 "claimed": true, 00:22:01.114 "claim_type": "exclusive_write", 00:22:01.114 "zoned": false, 00:22:01.114 "supported_io_types": { 00:22:01.114 "read": true, 00:22:01.114 "write": true, 00:22:01.114 "unmap": true, 00:22:01.114 "flush": true, 00:22:01.114 "reset": true, 00:22:01.114 "nvme_admin": false, 00:22:01.114 "nvme_io": false, 00:22:01.114 "nvme_io_md": false, 00:22:01.114 "write_zeroes": true, 00:22:01.114 "zcopy": true, 00:22:01.114 "get_zone_info": false, 00:22:01.114 "zone_management": false, 00:22:01.114 "zone_append": false, 00:22:01.114 "compare": false, 00:22:01.114 "compare_and_write": false, 00:22:01.114 "abort": true, 00:22:01.114 "seek_hole": false, 00:22:01.114 "seek_data": false, 00:22:01.114 "copy": true, 00:22:01.114 "nvme_iov_md": false 00:22:01.114 }, 00:22:01.114 "memory_domains": [ 00:22:01.114 { 00:22:01.114 "dma_device_id": "system", 00:22:01.114 "dma_device_type": 1 00:22:01.114 }, 00:22:01.114 { 00:22:01.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.114 "dma_device_type": 2 00:22:01.114 } 00:22:01.114 ], 00:22:01.114 "driver_specific": { 00:22:01.114 "passthru": { 00:22:01.114 "name": "pt2", 00:22:01.114 "base_bdev_name": "malloc2" 00:22:01.114 } 00:22:01.114 } 00:22:01.114 }' 00:22:01.114 18:58:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:01.114 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:01.373 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:01.373 [2024-07-24 18:58:46.368945] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 34354063-6106-4b7e-9059-b2c9f793c293 '!=' 34354063-6106-4b7e-9059-b2c9f793c293 ']' 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:01.632 [2024-07-24 18:58:46.541252] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.632 18:58:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.632 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.890 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.890 "name": "raid_bdev1", 00:22:01.890 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:22:01.890 "strip_size_kb": 0, 00:22:01.890 "state": "online", 00:22:01.890 "raid_level": "raid1", 00:22:01.890 "superblock": true, 00:22:01.890 "num_base_bdevs": 2, 00:22:01.890 "num_base_bdevs_discovered": 1, 00:22:01.890 "num_base_bdevs_operational": 1, 00:22:01.890 "base_bdevs_list": [ 00:22:01.890 { 00:22:01.890 "name": null, 00:22:01.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.890 "is_configured": false, 00:22:01.890 "data_offset": 256, 00:22:01.890 "data_size": 7936 00:22:01.890 }, 00:22:01.890 { 00:22:01.890 "name": "pt2", 00:22:01.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.890 "is_configured": true, 00:22:01.890 "data_offset": 256, 00:22:01.890 "data_size": 7936 00:22:01.890 } 00:22:01.890 ] 00:22:01.890 }' 00:22:01.890 18:58:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.890 18:58:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:02.455 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:02.455 [2024-07-24 18:58:47.383409] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:02.455 [2024-07-24 18:58:47.383427] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:02.455 [2024-07-24 18:58:47.383476] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:02.455 [2024-07-24 18:58:47.383506] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:02.455 [2024-07-24 18:58:47.383512] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f9280 name raid_bdev1, state offline 00:22:02.455 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.455 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:22:02.714 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:02.972 [2024-07-24 18:58:47.864648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:02.972 [2024-07-24 18:58:47.864682] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.972 [2024-07-24 18:58:47.864690] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23f8760 00:22:02.972 [2024-07-24 18:58:47.864695] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.972 [2024-07-24 18:58:47.865798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.972 [2024-07-24 18:58:47.865818] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:02.972 [2024-07-24 18:58:47.865848] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:02.972 [2024-07-24 18:58:47.865866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:02.972 [2024-07-24 18:58:47.865920] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x257e8f0 00:22:02.972 [2024-07-24 18:58:47.865925] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:02.972 [2024-07-24 18:58:47.865968] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x257da60 00:22:02.972 [2024-07-24 18:58:47.866032] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x257e8f0 00:22:02.972 [2024-07-24 18:58:47.866037] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x257e8f0 00:22:02.972 [2024-07-24 18:58:47.866080] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.972 pt2 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:02.972 18:58:47 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.972 18:58:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.230 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.230 "name": "raid_bdev1", 00:22:03.230 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:22:03.230 "strip_size_kb": 0, 00:22:03.230 "state": "online", 00:22:03.230 "raid_level": "raid1", 00:22:03.230 "superblock": true, 00:22:03.230 "num_base_bdevs": 2, 00:22:03.230 "num_base_bdevs_discovered": 1, 00:22:03.230 "num_base_bdevs_operational": 1, 00:22:03.230 "base_bdevs_list": [ 00:22:03.230 { 00:22:03.230 "name": null, 00:22:03.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.230 "is_configured": false, 00:22:03.230 "data_offset": 256, 00:22:03.230 "data_size": 7936 00:22:03.230 }, 00:22:03.230 { 00:22:03.230 "name": "pt2", 00:22:03.230 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:03.230 "is_configured": true, 00:22:03.230 "data_offset": 256, 00:22:03.230 "data_size": 7936 00:22:03.230 } 00:22:03.230 ] 00:22:03.230 }' 00:22:03.230 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.230 18:58:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:03.797 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:03.797 [2024-07-24 18:58:48.690763] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:03.797 [2024-07-24 18:58:48.690780] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:03.797 [2024-07-24 18:58:48.690815] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:03.797 [2024-07-24 18:58:48.690845] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:03.797 [2024-07-24 18:58:48.690851] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257e8f0 name raid_bdev1, state offline 00:22:03.797 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.797 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:04.056 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:04.056 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:04.056 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:04.056 18:58:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:04.056 [2024-07-24 18:58:49.027626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:04.056 [2024-07-24 
18:58:49.027657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.056 [2024-07-24 18:58:49.027666] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23faec0 00:22:04.056 [2024-07-24 18:58:49.027672] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.056 [2024-07-24 18:58:49.028743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.056 [2024-07-24 18:58:49.028762] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:04.056 [2024-07-24 18:58:49.028792] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:04.056 [2024-07-24 18:58:49.028813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:04.056 [2024-07-24 18:58:49.028874] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:04.056 [2024-07-24 18:58:49.028880] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:04.056 [2024-07-24 18:58:49.028888] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257f630 name raid_bdev1, state configuring 00:22:04.056 [2024-07-24 18:58:49.028901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:04.056 [2024-07-24 18:58:49.028936] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x257f630 00:22:04.056 [2024-07-24 18:58:49.028941] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:04.056 [2024-07-24 18:58:49.028978] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25804f0 00:22:04.056 [2024-07-24 18:58:49.029042] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x257f630 00:22:04.056 [2024-07-24 18:58:49.029047] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x257f630 00:22:04.056 [2024-07-24 18:58:49.029091] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:04.056 pt1 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.056 18:58:49 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.056 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.315 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.315 "name": "raid_bdev1", 00:22:04.315 "uuid": "34354063-6106-4b7e-9059-b2c9f793c293", 00:22:04.315 "strip_size_kb": 0, 00:22:04.315 "state": "online", 00:22:04.315 "raid_level": "raid1", 00:22:04.315 "superblock": true, 00:22:04.315 "num_base_bdevs": 2, 00:22:04.315 "num_base_bdevs_discovered": 1, 00:22:04.315 "num_base_bdevs_operational": 1, 00:22:04.315 "base_bdevs_list": [ 00:22:04.315 { 00:22:04.315 "name": null, 00:22:04.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.315 "is_configured": false, 00:22:04.315 "data_offset": 256, 00:22:04.315 "data_size": 7936 00:22:04.315 }, 00:22:04.315 { 00:22:04.315 "name": "pt2", 00:22:04.315 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:04.315 "is_configured": true, 00:22:04.315 "data_offset": 256, 00:22:04.315 "data_size": 7936 00:22:04.315 } 00:22:04.315 ] 00:22:04.315 }' 00:22:04.315 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.315 18:58:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:04.882 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:04.882 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:04.882 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:05.143 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:05.143 18:58:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:05.143 [2024-07-24 18:58:50.042417] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 34354063-6106-4b7e-9059-b2c9f793c293 '!=' 34354063-6106-4b7e-9059-b2c9f793c293 ']' 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2192007 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2192007 ']' 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2192007 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2192007 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2192007' 00:22:05.143 killing process with pid 2192007 00:22:05.143 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2192007 00:22:05.143 [2024-07-24 18:58:50.108079] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:05.143 [2024-07-24 18:58:50.108124] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:05.144 [2024-07-24 18:58:50.108154] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:05.144 [2024-07-24 18:58:50.108159] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257f630 name raid_bdev1, state offline 00:22:05.144 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2192007 00:22:05.144 [2024-07-24 18:58:50.127100] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:05.439 18:58:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:22:05.439 00:22:05.439 real 0m11.516s 00:22:05.439 user 0m21.201s 00:22:05.439 sys 0m1.784s 00:22:05.439 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:05.439 18:58:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:05.439 ************************************ 00:22:05.439 END TEST raid_superblock_test_md_separate 00:22:05.439 ************************************ 00:22:05.439 18:58:50 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:22:05.439 18:58:50 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:22:05.439 18:58:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:05.439 18:58:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:05.439 18:58:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:05.439 ************************************ 00:22:05.439 START TEST raid_rebuild_test_sb_md_separate 00:22:05.439 ************************************ 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:05.439 18:58:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2194149 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2194149 /var/tmp/spdk-raid.sock 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2194149 ']' 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:05.439 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:05.439 18:58:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:05.439 [2024-07-24 18:58:50.424586] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:22:05.439 [2024-07-24 18:58:50.424626] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2194149 ] 00:22:05.439 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:05.439 Zero copy mechanism will not be used. 00:22:05.709 [2024-07-24 18:58:50.487391] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:05.709 [2024-07-24 18:58:50.566283] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:05.709 [2024-07-24 18:58:50.626218] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:05.709 [2024-07-24 18:58:50.626245] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:06.275 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:06.275 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:06.275 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:06.275 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:22:06.533 BaseBdev1_malloc 00:22:06.533 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:06.533 [2024-07-24 18:58:51.538656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:06.533 [2024-07-24 18:58:51.538690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:06.533 [2024-07-24 18:58:51.538703] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81eac0 00:22:06.533 [2024-07-24 18:58:51.538709] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:06.533 [2024-07-24 18:58:51.539630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:06.533 [2024-07-24 18:58:51.539649] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:06.791 BaseBdev1 00:22:06.791 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:06.791 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:22:06.791 BaseBdev2_malloc 00:22:06.791 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:07.049 [2024-07-24 18:58:51.899714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:07.049 [2024-07-24 18:58:51.899744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.050 [2024-07-24 18:58:51.899756] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b08f0 00:22:07.050 [2024-07-24 18:58:51.899762] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.050 [2024-07-24 18:58:51.900814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.050 [2024-07-24 18:58:51.900834] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:07.050 BaseBdev2 00:22:07.050 18:58:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:22:07.308 spare_malloc 00:22:07.308 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:07.308 spare_delay 00:22:07.308 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:07.566 [2024-07-24 18:58:52.409244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:07.566 [2024-07-24 18:58:52.409276] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.567 [2024-07-24 18:58:52.409289] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x816940 00:22:07.567 [2024-07-24 18:58:52.409296] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.567 [2024-07-24 18:58:52.410267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.567 [2024-07-24 18:58:52.410289] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:07.567 spare 00:22:07.567 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:07.567 [2024-07-24 18:58:52.569681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:07.567 [2024-07-24 18:58:52.570542] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:07.567 [2024-07-24 18:58:52.570654] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x819250 00:22:07.567 [2024-07-24 18:58:52.570662] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:07.567 [2024-07-24 18:58:52.570711] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x818770 00:22:07.567 [2024-07-24 18:58:52.570788] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x819250 00:22:07.567 [2024-07-24 18:58:52.570793] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x819250 00:22:07.567 [2024-07-24 18:58:52.570836] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.826 18:58:52 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.826 "name": "raid_bdev1", 00:22:07.826 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:07.826 "strip_size_kb": 0, 00:22:07.826 "state": "online", 00:22:07.826 "raid_level": "raid1", 00:22:07.826 "superblock": true, 00:22:07.826 "num_base_bdevs": 2, 00:22:07.826 "num_base_bdevs_discovered": 2, 00:22:07.826 "num_base_bdevs_operational": 2, 00:22:07.826 "base_bdevs_list": [ 00:22:07.826 { 00:22:07.826 "name": "BaseBdev1", 00:22:07.826 "uuid": "e687c655-42c7-535e-bbd3-e848152b83c1", 00:22:07.826 "is_configured": true, 00:22:07.826 "data_offset": 256, 00:22:07.826 "data_size": 7936 00:22:07.826 }, 00:22:07.826 { 00:22:07.826 "name": "BaseBdev2", 00:22:07.826 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:07.826 "is_configured": true, 00:22:07.826 "data_offset": 256, 00:22:07.826 "data_size": 7936 00:22:07.826 } 00:22:07.826 ] 00:22:07.826 }' 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.826 18:58:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:08.392 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:08.392 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:08.392 [2024-07-24 18:58:53.379925] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:08.392 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:08.651 18:58:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:08.651 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:08.910 [2024-07-24 18:58:53.716659] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x818770 00:22:08.910 /dev/nbd0 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:08.910 1+0 records in 00:22:08.910 1+0 records out 00:22:08.910 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229451 s, 17.9 MB/s 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # 
rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:08.910 18:58:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:22:09.478 7936+0 records in 00:22:09.478 7936+0 records out 00:22:09.478 32505856 bytes (33 MB, 31 MiB) copied, 0.500831 s, 64.9 MB/s 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:09.478 [2024-07-24 18:58:54.452837] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:09.478 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:09.479 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:09.737 [2024-07-24 18:58:54.613271] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.737 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.996 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.996 "name": "raid_bdev1", 00:22:09.996 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:09.996 "strip_size_kb": 0, 00:22:09.996 "state": "online", 00:22:09.996 "raid_level": "raid1", 00:22:09.996 "superblock": true, 00:22:09.996 "num_base_bdevs": 2, 00:22:09.996 "num_base_bdevs_discovered": 1, 00:22:09.996 "num_base_bdevs_operational": 1, 00:22:09.996 "base_bdevs_list": [ 00:22:09.996 { 00:22:09.996 "name": null, 00:22:09.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.996 "is_configured": false, 00:22:09.996 "data_offset": 256, 00:22:09.996 "data_size": 7936 00:22:09.996 }, 00:22:09.996 { 00:22:09.996 "name": "BaseBdev2", 00:22:09.996 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:09.996 "is_configured": true, 00:22:09.996 "data_offset": 256, 00:22:09.996 "data_size": 7936 00:22:09.996 } 00:22:09.996 ] 00:22:09.996 }' 00:22:09.996 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.996 18:58:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:10.562 18:58:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:10.562 [2024-07-24 18:58:55.427396] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:10.562 [2024-07-24 18:58:55.429332] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x818770 00:22:10.562 [2024-07-24 18:58:55.430685] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:10.562 18:58:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.497 
18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.497 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.756 "name": "raid_bdev1", 00:22:11.756 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:11.756 "strip_size_kb": 0, 00:22:11.756 "state": "online", 00:22:11.756 "raid_level": "raid1", 00:22:11.756 "superblock": true, 00:22:11.756 "num_base_bdevs": 2, 00:22:11.756 "num_base_bdevs_discovered": 2, 00:22:11.756 "num_base_bdevs_operational": 2, 00:22:11.756 "process": { 00:22:11.756 "type": "rebuild", 00:22:11.756 "target": "spare", 00:22:11.756 "progress": { 00:22:11.756 "blocks": 2816, 00:22:11.756 "percent": 35 00:22:11.756 } 00:22:11.756 }, 00:22:11.756 "base_bdevs_list": [ 00:22:11.756 { 00:22:11.756 "name": "spare", 00:22:11.756 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:11.756 "is_configured": true, 00:22:11.756 "data_offset": 256, 00:22:11.756 "data_size": 7936 00:22:11.756 }, 00:22:11.756 { 00:22:11.756 "name": "BaseBdev2", 00:22:11.756 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:11.756 "is_configured": true, 00:22:11.756 "data_offset": 256, 00:22:11.756 "data_size": 7936 00:22:11.756 } 00:22:11.756 ] 00:22:11.756 }' 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.756 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:12.014 [2024-07-24 18:58:56.871835] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:12.014 [2024-07-24 18:58:56.941277] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:12.014 [2024-07-24 18:58:56.941307] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:12.014 [2024-07-24 18:58:56.941317] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:12.014 [2024-07-24 18:58:56.941337] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.014 18:58:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.014 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.015 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.015 18:58:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.273 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.273 "name": "raid_bdev1", 00:22:12.273 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:12.273 "strip_size_kb": 0, 00:22:12.273 "state": "online", 00:22:12.273 "raid_level": "raid1", 00:22:12.273 "superblock": true, 00:22:12.273 "num_base_bdevs": 2, 00:22:12.273 "num_base_bdevs_discovered": 1, 00:22:12.273 "num_base_bdevs_operational": 1, 00:22:12.273 "base_bdevs_list": [ 00:22:12.273 { 00:22:12.273 "name": null, 00:22:12.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.273 "is_configured": false, 00:22:12.273 "data_offset": 256, 00:22:12.273 "data_size": 7936 00:22:12.273 }, 00:22:12.273 { 00:22:12.273 "name": "BaseBdev2", 00:22:12.273 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:12.273 "is_configured": true, 00:22:12.273 "data_offset": 256, 00:22:12.273 "data_size": 7936 00:22:12.273 } 00:22:12.273 ] 00:22:12.273 }' 00:22:12.273 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.273 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:22:12.840 "name": "raid_bdev1", 00:22:12.840 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:12.840 "strip_size_kb": 0, 00:22:12.840 "state": "online", 00:22:12.840 "raid_level": "raid1", 00:22:12.840 "superblock": true, 00:22:12.840 "num_base_bdevs": 2, 00:22:12.840 "num_base_bdevs_discovered": 1, 00:22:12.840 "num_base_bdevs_operational": 1, 00:22:12.840 "base_bdevs_list": [ 00:22:12.840 { 00:22:12.840 "name": null, 00:22:12.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.840 "is_configured": false, 00:22:12.840 "data_offset": 256, 00:22:12.840 "data_size": 7936 00:22:12.840 }, 00:22:12.840 { 00:22:12.840 "name": "BaseBdev2", 00:22:12.840 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:12.840 "is_configured": true, 00:22:12.840 "data_offset": 256, 00:22:12.840 "data_size": 7936 00:22:12.840 } 00:22:12.840 ] 00:22:12.840 }' 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:12.840 18:58:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:13.098 [2024-07-24 18:58:58.002910] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:13.098 [2024-07-24 18:58:58.004875] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x818770 00:22:13.098 [2024-07-24 18:58:58.005892] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:13.098 18:58:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.034 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.292 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.292 "name": "raid_bdev1", 00:22:14.292 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:14.292 "strip_size_kb": 0, 00:22:14.292 "state": "online", 00:22:14.292 "raid_level": "raid1", 00:22:14.292 "superblock": true, 00:22:14.292 "num_base_bdevs": 2, 00:22:14.292 "num_base_bdevs_discovered": 2, 00:22:14.292 "num_base_bdevs_operational": 2, 00:22:14.292 "process": { 00:22:14.292 "type": 
"rebuild", 00:22:14.292 "target": "spare", 00:22:14.292 "progress": { 00:22:14.292 "blocks": 2816, 00:22:14.292 "percent": 35 00:22:14.292 } 00:22:14.292 }, 00:22:14.292 "base_bdevs_list": [ 00:22:14.292 { 00:22:14.292 "name": "spare", 00:22:14.292 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:14.292 "is_configured": true, 00:22:14.292 "data_offset": 256, 00:22:14.292 "data_size": 7936 00:22:14.292 }, 00:22:14.292 { 00:22:14.292 "name": "BaseBdev2", 00:22:14.292 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:14.292 "is_configured": true, 00:22:14.292 "data_offset": 256, 00:22:14.292 "data_size": 7936 00:22:14.292 } 00:22:14.292 ] 00:22:14.292 }' 00:22:14.292 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.292 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:14.293 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=820 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.293 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.551 "name": "raid_bdev1", 00:22:14.551 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:14.551 "strip_size_kb": 0, 00:22:14.551 "state": "online", 00:22:14.551 "raid_level": "raid1", 00:22:14.551 "superblock": true, 00:22:14.551 "num_base_bdevs": 2, 00:22:14.551 "num_base_bdevs_discovered": 2, 00:22:14.551 "num_base_bdevs_operational": 2, 
00:22:14.551 "process": { 00:22:14.551 "type": "rebuild", 00:22:14.551 "target": "spare", 00:22:14.551 "progress": { 00:22:14.551 "blocks": 3328, 00:22:14.551 "percent": 41 00:22:14.551 } 00:22:14.551 }, 00:22:14.551 "base_bdevs_list": [ 00:22:14.551 { 00:22:14.551 "name": "spare", 00:22:14.551 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:14.551 "is_configured": true, 00:22:14.551 "data_offset": 256, 00:22:14.551 "data_size": 7936 00:22:14.551 }, 00:22:14.551 { 00:22:14.551 "name": "BaseBdev2", 00:22:14.551 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:14.551 "is_configured": true, 00:22:14.551 "data_offset": 256, 00:22:14.551 "data_size": 7936 00:22:14.551 } 00:22:14.551 ] 00:22:14.551 }' 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.551 18:58:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.926 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.927 "name": "raid_bdev1", 00:22:15.927 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:15.927 "strip_size_kb": 0, 00:22:15.927 "state": "online", 00:22:15.927 "raid_level": "raid1", 00:22:15.927 "superblock": true, 00:22:15.927 "num_base_bdevs": 2, 00:22:15.927 "num_base_bdevs_discovered": 2, 00:22:15.927 "num_base_bdevs_operational": 2, 00:22:15.927 "process": { 00:22:15.927 "type": "rebuild", 00:22:15.927 "target": "spare", 00:22:15.927 "progress": { 00:22:15.927 "blocks": 6656, 00:22:15.927 "percent": 83 00:22:15.927 } 00:22:15.927 }, 00:22:15.927 "base_bdevs_list": [ 00:22:15.927 { 00:22:15.927 "name": "spare", 00:22:15.927 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:15.927 "is_configured": true, 00:22:15.927 "data_offset": 256, 00:22:15.927 "data_size": 7936 00:22:15.927 }, 00:22:15.927 { 00:22:15.927 "name": "BaseBdev2", 00:22:15.927 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:15.927 "is_configured": true, 00:22:15.927 "data_offset": 256, 00:22:15.927 "data_size": 7936 
00:22:15.927 } 00:22:15.927 ] 00:22:15.927 }' 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:15.927 18:59:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:16.184 [2024-07-24 18:59:01.127720] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:16.184 [2024-07-24 18:59:01.127761] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:16.184 [2024-07-24 18:59:01.127821] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:17.120 "name": "raid_bdev1", 00:22:17.120 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:17.120 "strip_size_kb": 0, 00:22:17.120 "state": "online", 00:22:17.120 "raid_level": "raid1", 00:22:17.120 "superblock": true, 00:22:17.120 "num_base_bdevs": 2, 00:22:17.120 "num_base_bdevs_discovered": 2, 00:22:17.120 "num_base_bdevs_operational": 2, 00:22:17.120 "base_bdevs_list": [ 00:22:17.120 { 00:22:17.120 "name": "spare", 00:22:17.120 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:17.120 "is_configured": true, 00:22:17.120 "data_offset": 256, 00:22:17.120 "data_size": 7936 00:22:17.120 }, 00:22:17.120 { 00:22:17.120 "name": "BaseBdev2", 00:22:17.120 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:17.120 "is_configured": true, 00:22:17.120 "data_offset": 256, 00:22:17.120 "data_size": 7936 00:22:17.120 } 00:22:17.120 ] 00:22:17.120 }' 00:22:17.120 18:59:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.120 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:17.379 "name": "raid_bdev1", 00:22:17.379 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:17.379 "strip_size_kb": 0, 00:22:17.379 "state": "online", 00:22:17.379 "raid_level": "raid1", 00:22:17.379 "superblock": true, 00:22:17.379 "num_base_bdevs": 2, 00:22:17.379 "num_base_bdevs_discovered": 2, 00:22:17.379 "num_base_bdevs_operational": 2, 00:22:17.379 "base_bdevs_list": [ 00:22:17.379 { 00:22:17.379 "name": "spare", 00:22:17.379 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:17.379 "is_configured": true, 00:22:17.379 "data_offset": 256, 00:22:17.379 "data_size": 7936 00:22:17.379 }, 00:22:17.379 { 00:22:17.379 "name": "BaseBdev2", 00:22:17.379 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:17.379 "is_configured": true, 00:22:17.379 "data_offset": 256, 00:22:17.379 "data_size": 7936 00:22:17.379 } 00:22:17.379 ] 00:22:17.379 }' 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.379 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.638 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.638 "name": "raid_bdev1", 00:22:17.638 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:17.638 "strip_size_kb": 0, 00:22:17.638 "state": "online", 00:22:17.638 "raid_level": "raid1", 00:22:17.638 "superblock": true, 00:22:17.638 "num_base_bdevs": 2, 00:22:17.638 "num_base_bdevs_discovered": 2, 00:22:17.638 "num_base_bdevs_operational": 2, 00:22:17.638 "base_bdevs_list": [ 00:22:17.638 { 00:22:17.638 "name": "spare", 00:22:17.638 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:17.638 "is_configured": true, 00:22:17.638 "data_offset": 256, 00:22:17.638 "data_size": 7936 00:22:17.638 }, 00:22:17.638 { 00:22:17.638 "name": "BaseBdev2", 00:22:17.638 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:17.638 "is_configured": true, 00:22:17.638 "data_offset": 256, 00:22:17.638 "data_size": 7936 00:22:17.638 } 00:22:17.638 ] 00:22:17.638 }' 00:22:17.638 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.638 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:18.205 18:59:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:18.205 [2024-07-24 18:59:03.111518] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:18.205 [2024-07-24 18:59:03.111538] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:18.205 [2024-07-24 18:59:03.111580] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:18.205 [2024-07-24 18:59:03.111618] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:18.205 [2024-07-24 18:59:03.111623] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x819250 name raid_bdev1, state offline 00:22:18.205 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.205 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:18.464 18:59:03 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:18.464 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:18.722 /dev/nbd0 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:18.722 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:18.722 1+0 records in 00:22:18.722 1+0 records out 00:22:18.723 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223006 s, 18.4 MB/s 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:18.723 /dev/nbd1 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:18.723 1+0 records in 00:22:18.723 1+0 records out 00:22:18.723 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186275 s, 22.0 MB/s 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:18.723 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:18.981 18:59:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:19.239 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:19.498 [2024-07-24 18:59:04.479936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:19.498 [2024-07-24 18:59:04.479967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.498 [2024-07-24 18:59:04.479979] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81c0f0 00:22:19.498 [2024-07-24 18:59:04.479984] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.498 [2024-07-24 18:59:04.481054] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.498 
[2024-07-24 18:59:04.481073] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:19.498 [2024-07-24 18:59:04.481110] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:19.498 [2024-07-24 18:59:04.481127] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:19.498 [2024-07-24 18:59:04.481192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:19.498 spare 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.498 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.756 [2024-07-24 18:59:04.581477] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x81a200 00:22:19.756 [2024-07-24 18:59:04.581487] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:19.756 [2024-07-24 18:59:04.581538] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81d750 00:22:19.756 [2024-07-24 18:59:04.581623] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x81a200 00:22:19.756 [2024-07-24 18:59:04.581629] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x81a200 00:22:19.757 [2024-07-24 18:59:04.581680] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.757 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.757 "name": "raid_bdev1", 00:22:19.757 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:19.757 "strip_size_kb": 0, 00:22:19.757 "state": "online", 00:22:19.757 "raid_level": "raid1", 00:22:19.757 "superblock": true, 00:22:19.757 "num_base_bdevs": 2, 00:22:19.757 "num_base_bdevs_discovered": 2, 00:22:19.757 "num_base_bdevs_operational": 2, 00:22:19.757 "base_bdevs_list": [ 00:22:19.757 { 00:22:19.757 "name": "spare", 00:22:19.757 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:19.757 "is_configured": true, 00:22:19.757 "data_offset": 256, 00:22:19.757 "data_size": 7936 00:22:19.757 }, 
00:22:19.757 { 00:22:19.757 "name": "BaseBdev2", 00:22:19.757 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:19.757 "is_configured": true, 00:22:19.757 "data_offset": 256, 00:22:19.757 "data_size": 7936 00:22:19.757 } 00:22:19.757 ] 00:22:19.757 }' 00:22:19.757 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.757 18:59:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.323 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:20.582 "name": "raid_bdev1", 00:22:20.582 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:20.582 "strip_size_kb": 0, 00:22:20.582 "state": "online", 00:22:20.582 "raid_level": "raid1", 00:22:20.582 "superblock": true, 00:22:20.582 "num_base_bdevs": 2, 00:22:20.582 "num_base_bdevs_discovered": 2, 00:22:20.582 "num_base_bdevs_operational": 2, 00:22:20.582 "base_bdevs_list": [ 00:22:20.582 { 00:22:20.582 "name": "spare", 00:22:20.582 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:20.582 "is_configured": true, 00:22:20.582 "data_offset": 256, 00:22:20.582 "data_size": 7936 00:22:20.582 }, 00:22:20.582 { 00:22:20.582 "name": "BaseBdev2", 00:22:20.582 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:20.582 "is_configured": true, 00:22:20.582 "data_offset": 256, 00:22:20.582 "data_size": 7936 00:22:20.582 } 00:22:20.582 ] 00:22:20.582 }' 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:20.582 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:20.840 [2024-07-24 18:59:05.739265] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.840 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.099 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.099 "name": "raid_bdev1", 00:22:21.099 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:21.099 "strip_size_kb": 0, 00:22:21.099 "state": "online", 00:22:21.099 "raid_level": "raid1", 00:22:21.099 "superblock": true, 00:22:21.099 "num_base_bdevs": 2, 00:22:21.099 "num_base_bdevs_discovered": 1, 00:22:21.099 "num_base_bdevs_operational": 1, 00:22:21.099 "base_bdevs_list": [ 00:22:21.099 { 00:22:21.099 "name": null, 00:22:21.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.099 "is_configured": false, 00:22:21.099 "data_offset": 256, 00:22:21.099 "data_size": 7936 00:22:21.099 }, 00:22:21.099 { 00:22:21.099 "name": "BaseBdev2", 00:22:21.099 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:21.099 "is_configured": true, 00:22:21.099 "data_offset": 256, 00:22:21.099 "data_size": 7936 00:22:21.099 } 00:22:21.099 ] 00:22:21.099 }' 00:22:21.099 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.099 18:59:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:21.665 18:59:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:21.665 [2024-07-24 18:59:06.549364] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:21.665 [2024-07-24 18:59:06.549483] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:21.665 [2024-07-24 18:59:06.549494] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: 
Re-adding bdev spare to raid bdev raid_bdev1. 00:22:21.665 [2024-07-24 18:59:06.549513] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:21.665 [2024-07-24 18:59:06.551389] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81de00 00:22:21.665 [2024-07-24 18:59:06.552792] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:21.665 18:59:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.600 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:22.858 "name": "raid_bdev1", 00:22:22.858 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:22.858 "strip_size_kb": 0, 00:22:22.858 "state": "online", 00:22:22.858 "raid_level": "raid1", 00:22:22.858 "superblock": true, 00:22:22.858 "num_base_bdevs": 2, 00:22:22.858 "num_base_bdevs_discovered": 2, 00:22:22.858 "num_base_bdevs_operational": 2, 00:22:22.858 "process": { 00:22:22.858 "type": "rebuild", 00:22:22.858 "target": "spare", 00:22:22.858 "progress": { 00:22:22.858 "blocks": 2816, 00:22:22.858 "percent": 35 00:22:22.858 } 00:22:22.858 }, 00:22:22.858 "base_bdevs_list": [ 00:22:22.858 { 00:22:22.858 "name": "spare", 00:22:22.858 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:22.858 "is_configured": true, 00:22:22.858 "data_offset": 256, 00:22:22.858 "data_size": 7936 00:22:22.858 }, 00:22:22.858 { 00:22:22.858 "name": "BaseBdev2", 00:22:22.858 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:22.858 "is_configured": true, 00:22:22.858 "data_offset": 256, 00:22:22.858 "data_size": 7936 00:22:22.858 } 00:22:22.858 ] 00:22:22.858 }' 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:22.858 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:23.117 [2024-07-24 18:59:07.961917] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:23.117 [2024-07-24 18:59:07.962742] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:23.117 [2024-07-24 18:59:07.962769] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.117 [2024-07-24 18:59:07.962778] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:23.117 [2024-07-24 18:59:07.962782] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.117 18:59:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.402 18:59:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.402 "name": "raid_bdev1", 00:22:23.402 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:23.402 "strip_size_kb": 0, 00:22:23.402 "state": "online", 00:22:23.402 "raid_level": "raid1", 00:22:23.402 "superblock": true, 00:22:23.402 "num_base_bdevs": 2, 00:22:23.402 "num_base_bdevs_discovered": 1, 00:22:23.402 "num_base_bdevs_operational": 1, 00:22:23.402 "base_bdevs_list": [ 00:22:23.402 { 00:22:23.402 "name": null, 00:22:23.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.402 "is_configured": false, 00:22:23.402 "data_offset": 256, 00:22:23.402 "data_size": 7936 00:22:23.402 }, 00:22:23.402 { 00:22:23.402 "name": "BaseBdev2", 00:22:23.402 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:23.402 "is_configured": true, 00:22:23.402 "data_offset": 256, 00:22:23.402 "data_size": 7936 00:22:23.402 } 00:22:23.402 ] 00:22:23.402 }' 00:22:23.402 18:59:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.402 18:59:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:23.661 18:59:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:23.919 [2024-07-24 18:59:08.768073] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:23.919 [2024-07-24 18:59:08.768111] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.919 [2024-07-24 18:59:08.768126] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81cf60 00:22:23.919 [2024-07-24 18:59:08.768133] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.919 [2024-07-24 18:59:08.768296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.919 [2024-07-24 18:59:08.768305] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:23.919 [2024-07-24 18:59:08.768344] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:23.919 [2024-07-24 18:59:08.768351] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:23.919 [2024-07-24 18:59:08.768356] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:23.919 [2024-07-24 18:59:08.768366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:23.919 [2024-07-24 18:59:08.770226] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81de00 00:22:23.919 [2024-07-24 18:59:08.771258] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:23.919 spare 00:22:23.919 18:59:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.854 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.112 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.112 "name": "raid_bdev1", 00:22:25.112 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:25.112 "strip_size_kb": 0, 00:22:25.112 "state": "online", 00:22:25.112 "raid_level": "raid1", 00:22:25.112 "superblock": true, 00:22:25.112 "num_base_bdevs": 2, 00:22:25.112 "num_base_bdevs_discovered": 2, 00:22:25.112 "num_base_bdevs_operational": 2, 00:22:25.112 "process": { 00:22:25.112 "type": "rebuild", 00:22:25.112 "target": "spare", 00:22:25.112 "progress": { 00:22:25.112 "blocks": 2816, 00:22:25.112 "percent": 35 00:22:25.112 } 00:22:25.112 }, 00:22:25.112 "base_bdevs_list": [ 00:22:25.112 { 00:22:25.112 "name": "spare", 00:22:25.112 "uuid": "7b94b5e0-56d2-5b86-a216-1b4926fc58fd", 00:22:25.112 "is_configured": true, 00:22:25.112 "data_offset": 256, 00:22:25.112 "data_size": 7936 00:22:25.112 }, 00:22:25.112 { 00:22:25.112 "name": 
"BaseBdev2", 00:22:25.112 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:25.112 "is_configured": true, 00:22:25.112 "data_offset": 256, 00:22:25.112 "data_size": 7936 00:22:25.113 } 00:22:25.113 ] 00:22:25.113 }' 00:22:25.113 18:59:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.113 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.113 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.113 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.113 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:25.371 [2024-07-24 18:59:10.200835] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:25.371 [2024-07-24 18:59:10.281887] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:25.371 [2024-07-24 18:59:10.281916] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:25.371 [2024-07-24 18:59:10.281925] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:25.371 [2024-07-24 18:59:10.281928] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.371 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.630 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.630 "name": "raid_bdev1", 00:22:25.630 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:25.630 "strip_size_kb": 0, 00:22:25.630 "state": "online", 00:22:25.630 "raid_level": "raid1", 00:22:25.630 "superblock": true, 00:22:25.630 "num_base_bdevs": 2, 00:22:25.630 "num_base_bdevs_discovered": 
1, 00:22:25.630 "num_base_bdevs_operational": 1, 00:22:25.630 "base_bdevs_list": [ 00:22:25.630 { 00:22:25.630 "name": null, 00:22:25.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.630 "is_configured": false, 00:22:25.630 "data_offset": 256, 00:22:25.630 "data_size": 7936 00:22:25.630 }, 00:22:25.630 { 00:22:25.630 "name": "BaseBdev2", 00:22:25.630 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:25.630 "is_configured": true, 00:22:25.630 "data_offset": 256, 00:22:25.630 "data_size": 7936 00:22:25.630 } 00:22:25.630 ] 00:22:25.630 }' 00:22:25.630 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.630 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.196 18:59:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:26.196 "name": "raid_bdev1", 00:22:26.196 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:26.196 "strip_size_kb": 0, 00:22:26.196 "state": "online", 00:22:26.196 "raid_level": "raid1", 00:22:26.196 "superblock": true, 00:22:26.196 "num_base_bdevs": 2, 00:22:26.196 "num_base_bdevs_discovered": 1, 00:22:26.196 "num_base_bdevs_operational": 1, 00:22:26.196 "base_bdevs_list": [ 00:22:26.196 { 00:22:26.196 "name": null, 00:22:26.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.196 "is_configured": false, 00:22:26.196 "data_offset": 256, 00:22:26.196 "data_size": 7936 00:22:26.196 }, 00:22:26.196 { 00:22:26.196 "name": "BaseBdev2", 00:22:26.196 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:26.196 "is_configured": true, 00:22:26.196 "data_offset": 256, 00:22:26.196 "data_size": 7936 00:22:26.196 } 00:22:26.196 ] 00:22:26.196 }' 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:26.196 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:26.454 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:26.712 [2024-07-24 18:59:11.496788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:26.712 [2024-07-24 18:59:11.496822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.712 [2024-07-24 18:59:11.496833] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a82e0 00:22:26.712 [2024-07-24 18:59:11.496855] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.712 [2024-07-24 18:59:11.496996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.712 [2024-07-24 18:59:11.497005] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:26.712 [2024-07-24 18:59:11.497034] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:26.712 [2024-07-24 18:59:11.497041] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:26.712 [2024-07-24 18:59:11.497046] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:26.712 BaseBdev1 00:22:26.712 18:59:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.646 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.904 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.904 "name": "raid_bdev1", 00:22:27.904 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:27.904 "strip_size_kb": 0, 00:22:27.904 "state": "online", 00:22:27.904 "raid_level": "raid1", 00:22:27.904 "superblock": true, 00:22:27.904 "num_base_bdevs": 2, 00:22:27.904 "num_base_bdevs_discovered": 1, 00:22:27.904 "num_base_bdevs_operational": 1, 00:22:27.904 "base_bdevs_list": [ 00:22:27.904 { 
00:22:27.904 "name": null, 00:22:27.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.904 "is_configured": false, 00:22:27.904 "data_offset": 256, 00:22:27.904 "data_size": 7936 00:22:27.904 }, 00:22:27.904 { 00:22:27.904 "name": "BaseBdev2", 00:22:27.904 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:27.904 "is_configured": true, 00:22:27.904 "data_offset": 256, 00:22:27.904 "data_size": 7936 00:22:27.904 } 00:22:27.904 ] 00:22:27.904 }' 00:22:27.904 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.904 18:59:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.161 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:28.420 "name": "raid_bdev1", 00:22:28.420 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:28.420 "strip_size_kb": 0, 00:22:28.420 "state": "online", 00:22:28.420 "raid_level": "raid1", 00:22:28.420 "superblock": true, 00:22:28.420 "num_base_bdevs": 2, 00:22:28.420 "num_base_bdevs_discovered": 1, 00:22:28.420 "num_base_bdevs_operational": 1, 00:22:28.420 "base_bdevs_list": [ 00:22:28.420 { 00:22:28.420 "name": null, 00:22:28.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.420 "is_configured": false, 00:22:28.420 "data_offset": 256, 00:22:28.420 "data_size": 7936 00:22:28.420 }, 00:22:28.420 { 00:22:28.420 "name": "BaseBdev2", 00:22:28.420 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:28.420 "is_configured": true, 00:22:28.420 "data_offset": 256, 00:22:28.420 "data_size": 7936 00:22:28.420 } 00:22:28.420 ] 00:22:28.420 }' 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:28.420 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:28.678 [2024-07-24 18:59:13.546116] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:28.678 [2024-07-24 18:59:13.546214] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:28.678 [2024-07-24 18:59:13.546223] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:28.678 request: 00:22:28.678 { 00:22:28.678 "base_bdev": "BaseBdev1", 00:22:28.678 "raid_bdev": "raid_bdev1", 00:22:28.678 "method": "bdev_raid_add_base_bdev", 00:22:28.678 "req_id": 1 00:22:28.678 } 00:22:28.678 Got JSON-RPC error response 00:22:28.678 response: 00:22:28.678 { 00:22:28.678 "code": -22, 00:22:28.678 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:28.678 } 00:22:28.678 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:22:28.678 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:28.678 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:28.678 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:28.678 18:59:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.612 18:59:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.612 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.869 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.869 "name": "raid_bdev1", 00:22:29.869 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:29.869 "strip_size_kb": 0, 00:22:29.869 "state": "online", 00:22:29.869 "raid_level": "raid1", 00:22:29.869 "superblock": true, 00:22:29.869 "num_base_bdevs": 2, 00:22:29.869 "num_base_bdevs_discovered": 1, 00:22:29.869 "num_base_bdevs_operational": 1, 00:22:29.869 "base_bdevs_list": [ 00:22:29.869 { 00:22:29.869 "name": null, 00:22:29.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.869 "is_configured": false, 00:22:29.869 "data_offset": 256, 00:22:29.869 "data_size": 7936 00:22:29.869 }, 00:22:29.869 { 00:22:29.869 "name": "BaseBdev2", 00:22:29.869 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:29.869 "is_configured": true, 00:22:29.869 "data_offset": 256, 00:22:29.869 "data_size": 7936 00:22:29.869 } 00:22:29.869 ] 00:22:29.869 }' 00:22:29.869 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.869 18:59:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.433 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:30.433 "name": "raid_bdev1", 00:22:30.433 "uuid": "51383e88-1213-4783-b7bc-344d73e5fabd", 00:22:30.433 "strip_size_kb": 0, 
00:22:30.433 "state": "online", 00:22:30.433 "raid_level": "raid1", 00:22:30.433 "superblock": true, 00:22:30.433 "num_base_bdevs": 2, 00:22:30.433 "num_base_bdevs_discovered": 1, 00:22:30.433 "num_base_bdevs_operational": 1, 00:22:30.434 "base_bdevs_list": [ 00:22:30.434 { 00:22:30.434 "name": null, 00:22:30.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.434 "is_configured": false, 00:22:30.434 "data_offset": 256, 00:22:30.434 "data_size": 7936 00:22:30.434 }, 00:22:30.434 { 00:22:30.434 "name": "BaseBdev2", 00:22:30.434 "uuid": "01050d37-dbea-599c-9c9d-93d353b677b0", 00:22:30.434 "is_configured": true, 00:22:30.434 "data_offset": 256, 00:22:30.434 "data_size": 7936 00:22:30.434 } 00:22:30.434 ] 00:22:30.434 }' 00:22:30.434 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:30.434 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:30.434 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2194149 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2194149 ']' 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2194149 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2194149 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2194149' 00:22:30.692 killing process with pid 2194149 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2194149 00:22:30.692 Received shutdown signal, test time was about 60.000000 seconds 00:22:30.692 00:22:30.692 Latency(us) 00:22:30.692 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:30.692 =================================================================================================================== 00:22:30.692 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:30.692 [2024-07-24 18:59:15.514988] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:30.692 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2194149 00:22:30.692 [2024-07-24 18:59:15.515054] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:30.692 [2024-07-24 18:59:15.515087] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:30.692 [2024-07-24 18:59:15.515093] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x81a200 name raid_bdev1, state 
offline 00:22:30.692 [2024-07-24 18:59:15.542526] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:30.951 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:22:30.951 00:22:30.951 real 0m25.346s 00:22:30.951 user 0m38.838s 00:22:30.951 sys 0m3.251s 00:22:30.951 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:30.951 18:59:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:30.951 ************************************ 00:22:30.951 END TEST raid_rebuild_test_sb_md_separate 00:22:30.951 ************************************ 00:22:30.951 18:59:15 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:22:30.951 18:59:15 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:22:30.951 18:59:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:30.951 18:59:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:30.951 18:59:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:30.951 ************************************ 00:22:30.951 START TEST raid_state_function_test_sb_md_interleaved 00:22:30.951 ************************************ 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:30.951 18:59:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2198712 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2198712' 00:22:30.951 Process raid pid: 2198712 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2198712 /var/tmp/spdk-raid.sock 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2198712 ']' 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:30.951 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:30.951 18:59:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:30.951 [2024-07-24 18:59:15.830168] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:22:30.951 [2024-07-24 18:59:15.830207] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:30.951 [2024-07-24 18:59:15.892732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:31.209 [2024-07-24 18:59:15.965461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:31.209 [2024-07-24 18:59:16.021181] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:31.209 [2024-07-24 18:59:16.021205] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:31.774 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:31.774 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:22:31.774 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:32.032 [2024-07-24 18:59:16.788253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:32.032 [2024-07-24 18:59:16.788282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:32.032 [2024-07-24 18:59:16.788288] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:32.032 [2024-07-24 18:59:16.788293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.032 "name": "Existed_Raid", 00:22:32.032 "uuid": "9cc78f77-445a-4d73-92ec-443d42698e42", 00:22:32.032 "strip_size_kb": 0, 00:22:32.032 "state": "configuring", 00:22:32.032 "raid_level": "raid1", 00:22:32.032 "superblock": true, 00:22:32.032 "num_base_bdevs": 2, 00:22:32.032 "num_base_bdevs_discovered": 0, 00:22:32.032 "num_base_bdevs_operational": 2, 00:22:32.032 "base_bdevs_list": [ 00:22:32.032 { 00:22:32.032 "name": "BaseBdev1", 00:22:32.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.032 "is_configured": false, 00:22:32.032 "data_offset": 0, 00:22:32.032 "data_size": 0 00:22:32.032 }, 00:22:32.032 { 00:22:32.032 "name": "BaseBdev2", 00:22:32.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.032 "is_configured": false, 00:22:32.032 "data_offset": 0, 00:22:32.032 "data_size": 0 00:22:32.032 } 00:22:32.032 ] 00:22:32.032 }' 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.032 18:59:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:32.598 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:32.856 [2024-07-24 18:59:17.626334] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:32.856 [2024-07-24 18:59:17.626356] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1832b80 name Existed_Raid, state configuring 00:22:32.856 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:32.856 [2024-07-24 18:59:17.794783] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:32.856 [2024-07-24 18:59:17.794801] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:32.856 [2024-07-24 18:59:17.794806] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:32.856 [2024-07-24 18:59:17.794811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:32.856 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:22:33.115 [2024-07-24 18:59:17.967385] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.115 BaseBdev1 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:33.115 18:59:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:33.373 [ 00:22:33.373 { 00:22:33.373 "name": "BaseBdev1", 00:22:33.373 "aliases": [ 00:22:33.373 "69591515-bb31-4e2d-b8f7-2cb49c6925f9" 00:22:33.373 ], 00:22:33.373 "product_name": "Malloc disk", 00:22:33.373 "block_size": 4128, 00:22:33.373 "num_blocks": 8192, 00:22:33.373 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:33.373 "md_size": 32, 00:22:33.373 "md_interleave": true, 00:22:33.373 "dif_type": 0, 00:22:33.373 "assigned_rate_limits": { 00:22:33.373 "rw_ios_per_sec": 0, 00:22:33.373 "rw_mbytes_per_sec": 0, 00:22:33.373 "r_mbytes_per_sec": 0, 00:22:33.373 "w_mbytes_per_sec": 0 00:22:33.373 }, 00:22:33.373 "claimed": true, 00:22:33.373 "claim_type": "exclusive_write", 00:22:33.373 "zoned": false, 00:22:33.373 "supported_io_types": { 00:22:33.373 "read": true, 00:22:33.373 "write": true, 00:22:33.373 "unmap": true, 00:22:33.373 "flush": true, 00:22:33.373 "reset": true, 00:22:33.373 "nvme_admin": false, 00:22:33.373 "nvme_io": false, 00:22:33.373 "nvme_io_md": false, 00:22:33.373 "write_zeroes": true, 00:22:33.373 "zcopy": true, 00:22:33.373 "get_zone_info": false, 00:22:33.373 "zone_management": false, 00:22:33.373 "zone_append": false, 00:22:33.373 "compare": false, 00:22:33.373 "compare_and_write": false, 00:22:33.373 "abort": true, 00:22:33.373 "seek_hole": false, 00:22:33.373 "seek_data": false, 00:22:33.373 "copy": true, 00:22:33.373 "nvme_iov_md": false 00:22:33.373 }, 00:22:33.373 "memory_domains": [ 00:22:33.373 { 00:22:33.373 "dma_device_id": "system", 00:22:33.373 "dma_device_type": 1 00:22:33.373 }, 00:22:33.373 { 00:22:33.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.373 "dma_device_type": 2 00:22:33.373 } 00:22:33.373 ], 00:22:33.373 "driver_specific": {} 00:22:33.373 } 00:22:33.373 ] 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.373 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.632 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.632 "name": "Existed_Raid", 00:22:33.632 "uuid": "691a5008-4326-4c1e-94f5-992b27adcef1", 00:22:33.632 "strip_size_kb": 0, 00:22:33.632 "state": "configuring", 00:22:33.632 "raid_level": "raid1", 00:22:33.632 "superblock": true, 00:22:33.632 "num_base_bdevs": 2, 00:22:33.632 "num_base_bdevs_discovered": 1, 00:22:33.632 "num_base_bdevs_operational": 2, 00:22:33.632 "base_bdevs_list": [ 00:22:33.632 { 00:22:33.632 "name": "BaseBdev1", 00:22:33.632 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:33.632 "is_configured": true, 00:22:33.632 "data_offset": 256, 00:22:33.632 "data_size": 7936 00:22:33.632 }, 00:22:33.632 { 00:22:33.632 "name": "BaseBdev2", 00:22:33.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.632 "is_configured": false, 00:22:33.632 "data_offset": 0, 00:22:33.632 "data_size": 0 00:22:33.632 } 00:22:33.632 ] 00:22:33.632 }' 00:22:33.632 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.632 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:34.198 18:59:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:34.198 [2024-07-24 18:59:19.146463] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:34.198 [2024-07-24 18:59:19.146503] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1832470 name Existed_Raid, state configuring 00:22:34.198 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:34.456 [2024-07-24 18:59:19.314927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:34.456 [2024-07-24 18:59:19.315937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:34.456 [2024-07-24 18:59:19.315961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.456 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.714 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.714 "name": "Existed_Raid", 00:22:34.715 "uuid": "8be6d5dd-8f9e-4bb3-9027-37742b627d9f", 00:22:34.715 "strip_size_kb": 0, 00:22:34.715 "state": "configuring", 00:22:34.715 "raid_level": "raid1", 00:22:34.715 "superblock": true, 00:22:34.715 "num_base_bdevs": 2, 00:22:34.715 "num_base_bdevs_discovered": 1, 00:22:34.715 "num_base_bdevs_operational": 2, 00:22:34.715 "base_bdevs_list": [ 00:22:34.715 { 00:22:34.715 "name": "BaseBdev1", 00:22:34.715 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:34.715 "is_configured": true, 00:22:34.715 "data_offset": 256, 00:22:34.715 "data_size": 7936 00:22:34.715 }, 00:22:34.715 { 00:22:34.715 "name": "BaseBdev2", 00:22:34.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.715 "is_configured": false, 00:22:34.715 "data_offset": 0, 00:22:34.715 "data_size": 0 00:22:34.715 } 00:22:34.715 ] 00:22:34.715 }' 00:22:34.715 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.715 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:35.281 18:59:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:22:35.281 [2024-07-24 18:59:20.160043] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:35.281 [2024-07-24 18:59:20.160145] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1831c80 00:22:35.281 [2024-07-24 18:59:20.160153] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:35.281 [2024-07-24 18:59:20.160212] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18b4520 00:22:35.281 [2024-07-24 18:59:20.160262] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1831c80 00:22:35.281 [2024-07-24 18:59:20.160267] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x1831c80 00:22:35.282 [2024-07-24 18:59:20.160305] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.282 BaseBdev2 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:35.282 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:35.539 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:35.540 [ 00:22:35.540 { 00:22:35.540 "name": "BaseBdev2", 00:22:35.540 "aliases": [ 00:22:35.540 "4cb6da4d-2b27-4640-8e93-7b39d273e088" 00:22:35.540 ], 00:22:35.540 "product_name": "Malloc disk", 00:22:35.540 "block_size": 4128, 00:22:35.540 "num_blocks": 8192, 00:22:35.540 "uuid": "4cb6da4d-2b27-4640-8e93-7b39d273e088", 00:22:35.540 "md_size": 32, 00:22:35.540 "md_interleave": true, 00:22:35.540 "dif_type": 0, 00:22:35.540 "assigned_rate_limits": { 00:22:35.540 "rw_ios_per_sec": 0, 00:22:35.540 "rw_mbytes_per_sec": 0, 00:22:35.540 "r_mbytes_per_sec": 0, 00:22:35.540 "w_mbytes_per_sec": 0 00:22:35.540 }, 00:22:35.540 "claimed": true, 00:22:35.540 "claim_type": "exclusive_write", 00:22:35.540 "zoned": false, 00:22:35.540 "supported_io_types": { 00:22:35.540 "read": true, 00:22:35.540 "write": true, 00:22:35.540 "unmap": true, 00:22:35.540 "flush": true, 00:22:35.540 "reset": true, 00:22:35.540 "nvme_admin": false, 00:22:35.540 "nvme_io": false, 00:22:35.540 "nvme_io_md": false, 00:22:35.540 "write_zeroes": true, 00:22:35.540 "zcopy": true, 00:22:35.540 "get_zone_info": false, 00:22:35.540 "zone_management": false, 00:22:35.540 "zone_append": false, 00:22:35.540 "compare": false, 00:22:35.540 "compare_and_write": false, 00:22:35.540 "abort": true, 00:22:35.540 "seek_hole": false, 00:22:35.540 "seek_data": false, 00:22:35.540 "copy": true, 00:22:35.540 "nvme_iov_md": false 00:22:35.540 }, 00:22:35.540 "memory_domains": [ 00:22:35.540 { 00:22:35.540 "dma_device_id": "system", 00:22:35.540 "dma_device_type": 1 00:22:35.540 }, 00:22:35.540 { 00:22:35.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.540 "dma_device_type": 2 00:22:35.540 } 00:22:35.540 ], 00:22:35.540 "driver_specific": {} 00:22:35.540 } 00:22:35.540 ] 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:35.540 18:59:20 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.540 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:35.798 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.798 "name": "Existed_Raid", 00:22:35.798 "uuid": "8be6d5dd-8f9e-4bb3-9027-37742b627d9f", 00:22:35.798 "strip_size_kb": 0, 00:22:35.798 "state": "online", 00:22:35.798 "raid_level": "raid1", 00:22:35.798 "superblock": true, 00:22:35.798 "num_base_bdevs": 2, 00:22:35.798 "num_base_bdevs_discovered": 2, 00:22:35.798 "num_base_bdevs_operational": 2, 00:22:35.798 "base_bdevs_list": [ 00:22:35.798 { 00:22:35.798 "name": "BaseBdev1", 00:22:35.798 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:35.798 "is_configured": true, 00:22:35.798 "data_offset": 256, 00:22:35.798 "data_size": 7936 00:22:35.798 }, 00:22:35.798 { 00:22:35.798 "name": "BaseBdev2", 00:22:35.798 "uuid": "4cb6da4d-2b27-4640-8e93-7b39d273e088", 00:22:35.798 "is_configured": true, 00:22:35.798 "data_offset": 256, 00:22:35.798 "data_size": 7936 00:22:35.798 } 00:22:35.798 ] 00:22:35.798 }' 00:22:35.798 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.798 18:59:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:36.364 [2024-07-24 18:59:21.339307] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:36.364 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:36.364 "name": "Existed_Raid", 00:22:36.364 "aliases": [ 00:22:36.364 "8be6d5dd-8f9e-4bb3-9027-37742b627d9f" 00:22:36.364 ], 00:22:36.364 "product_name": "Raid Volume", 00:22:36.364 "block_size": 4128, 00:22:36.364 "num_blocks": 7936, 00:22:36.364 "uuid": "8be6d5dd-8f9e-4bb3-9027-37742b627d9f", 00:22:36.364 "md_size": 32, 00:22:36.364 "md_interleave": true, 00:22:36.364 "dif_type": 0, 00:22:36.364 "assigned_rate_limits": { 00:22:36.364 "rw_ios_per_sec": 0, 00:22:36.364 "rw_mbytes_per_sec": 0, 00:22:36.364 "r_mbytes_per_sec": 0, 00:22:36.364 "w_mbytes_per_sec": 0 00:22:36.364 }, 00:22:36.364 "claimed": false, 00:22:36.364 "zoned": false, 00:22:36.364 "supported_io_types": { 00:22:36.364 "read": true, 00:22:36.364 "write": true, 00:22:36.364 "unmap": false, 00:22:36.364 "flush": false, 00:22:36.364 "reset": true, 00:22:36.364 "nvme_admin": false, 00:22:36.364 "nvme_io": false, 00:22:36.364 "nvme_io_md": false, 00:22:36.364 "write_zeroes": true, 00:22:36.364 "zcopy": false, 00:22:36.364 "get_zone_info": false, 00:22:36.364 "zone_management": false, 00:22:36.364 "zone_append": false, 00:22:36.364 "compare": false, 00:22:36.364 "compare_and_write": false, 00:22:36.364 "abort": false, 00:22:36.364 "seek_hole": false, 00:22:36.364 "seek_data": false, 00:22:36.364 "copy": false, 00:22:36.364 "nvme_iov_md": false 00:22:36.364 }, 00:22:36.364 "memory_domains": [ 00:22:36.364 { 00:22:36.364 "dma_device_id": "system", 00:22:36.364 "dma_device_type": 1 00:22:36.364 }, 00:22:36.364 { 00:22:36.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.364 "dma_device_type": 2 00:22:36.364 }, 00:22:36.364 { 00:22:36.364 "dma_device_id": "system", 00:22:36.364 "dma_device_type": 1 00:22:36.364 }, 00:22:36.364 { 00:22:36.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.364 "dma_device_type": 2 00:22:36.364 } 00:22:36.364 ], 00:22:36.364 "driver_specific": { 00:22:36.364 "raid": { 00:22:36.364 "uuid": "8be6d5dd-8f9e-4bb3-9027-37742b627d9f", 00:22:36.364 "strip_size_kb": 0, 00:22:36.364 "state": "online", 00:22:36.364 "raid_level": "raid1", 00:22:36.364 "superblock": true, 00:22:36.364 "num_base_bdevs": 2, 00:22:36.364 "num_base_bdevs_discovered": 2, 00:22:36.364 "num_base_bdevs_operational": 2, 00:22:36.364 "base_bdevs_list": [ 00:22:36.364 { 00:22:36.364 "name": "BaseBdev1", 00:22:36.364 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:36.364 "is_configured": true, 00:22:36.364 "data_offset": 256, 00:22:36.364 "data_size": 7936 00:22:36.364 }, 00:22:36.364 { 00:22:36.364 "name": "BaseBdev2", 00:22:36.364 "uuid": "4cb6da4d-2b27-4640-8e93-7b39d273e088", 00:22:36.364 "is_configured": true, 00:22:36.364 "data_offset": 256, 00:22:36.364 "data_size": 7936 00:22:36.364 } 00:22:36.364 ] 00:22:36.364 } 00:22:36.364 } 00:22:36.364 }' 00:22:36.364 18:59:21 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:36.622 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:36.622 BaseBdev2' 00:22:36.622 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:36.622 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:36.622 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:36.622 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:36.622 "name": "BaseBdev1", 00:22:36.623 "aliases": [ 00:22:36.623 "69591515-bb31-4e2d-b8f7-2cb49c6925f9" 00:22:36.623 ], 00:22:36.623 "product_name": "Malloc disk", 00:22:36.623 "block_size": 4128, 00:22:36.623 "num_blocks": 8192, 00:22:36.623 "uuid": "69591515-bb31-4e2d-b8f7-2cb49c6925f9", 00:22:36.623 "md_size": 32, 00:22:36.623 "md_interleave": true, 00:22:36.623 "dif_type": 0, 00:22:36.623 "assigned_rate_limits": { 00:22:36.623 "rw_ios_per_sec": 0, 00:22:36.623 "rw_mbytes_per_sec": 0, 00:22:36.623 "r_mbytes_per_sec": 0, 00:22:36.623 "w_mbytes_per_sec": 0 00:22:36.623 }, 00:22:36.623 "claimed": true, 00:22:36.623 "claim_type": "exclusive_write", 00:22:36.623 "zoned": false, 00:22:36.623 "supported_io_types": { 00:22:36.623 "read": true, 00:22:36.623 "write": true, 00:22:36.623 "unmap": true, 00:22:36.623 "flush": true, 00:22:36.623 "reset": true, 00:22:36.623 "nvme_admin": false, 00:22:36.623 "nvme_io": false, 00:22:36.623 "nvme_io_md": false, 00:22:36.623 "write_zeroes": true, 00:22:36.623 "zcopy": true, 00:22:36.623 "get_zone_info": false, 00:22:36.623 "zone_management": false, 00:22:36.623 "zone_append": false, 00:22:36.623 "compare": false, 00:22:36.623 "compare_and_write": false, 00:22:36.623 "abort": true, 00:22:36.623 "seek_hole": false, 00:22:36.623 "seek_data": false, 00:22:36.623 "copy": true, 00:22:36.623 "nvme_iov_md": false 00:22:36.623 }, 00:22:36.623 "memory_domains": [ 00:22:36.623 { 00:22:36.623 "dma_device_id": "system", 00:22:36.623 "dma_device_type": 1 00:22:36.623 }, 00:22:36.623 { 00:22:36.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.623 "dma_device_type": 2 00:22:36.623 } 00:22:36.623 ], 00:22:36.623 "driver_specific": {} 00:22:36.623 }' 00:22:36.623 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.881 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:37.140 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:37.140 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:37.140 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:37.140 18:59:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:37.140 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:37.140 "name": "BaseBdev2", 00:22:37.140 "aliases": [ 00:22:37.140 "4cb6da4d-2b27-4640-8e93-7b39d273e088" 00:22:37.140 ], 00:22:37.140 "product_name": "Malloc disk", 00:22:37.140 "block_size": 4128, 00:22:37.140 "num_blocks": 8192, 00:22:37.140 "uuid": "4cb6da4d-2b27-4640-8e93-7b39d273e088", 00:22:37.140 "md_size": 32, 00:22:37.140 "md_interleave": true, 00:22:37.140 "dif_type": 0, 00:22:37.140 "assigned_rate_limits": { 00:22:37.140 "rw_ios_per_sec": 0, 00:22:37.140 "rw_mbytes_per_sec": 0, 00:22:37.140 "r_mbytes_per_sec": 0, 00:22:37.140 "w_mbytes_per_sec": 0 00:22:37.140 }, 00:22:37.140 "claimed": true, 00:22:37.140 "claim_type": "exclusive_write", 00:22:37.140 "zoned": false, 00:22:37.140 "supported_io_types": { 00:22:37.140 "read": true, 00:22:37.140 "write": true, 00:22:37.140 "unmap": true, 00:22:37.140 "flush": true, 00:22:37.140 "reset": true, 00:22:37.140 "nvme_admin": false, 00:22:37.140 "nvme_io": false, 00:22:37.140 "nvme_io_md": false, 00:22:37.140 "write_zeroes": true, 00:22:37.140 "zcopy": true, 00:22:37.140 "get_zone_info": false, 00:22:37.140 "zone_management": false, 00:22:37.140 "zone_append": false, 00:22:37.140 "compare": false, 00:22:37.140 "compare_and_write": false, 00:22:37.140 "abort": true, 00:22:37.140 "seek_hole": false, 00:22:37.140 "seek_data": false, 00:22:37.140 "copy": true, 00:22:37.140 "nvme_iov_md": false 00:22:37.140 }, 00:22:37.140 "memory_domains": [ 00:22:37.140 { 00:22:37.140 "dma_device_id": "system", 00:22:37.140 "dma_device_type": 1 00:22:37.140 }, 00:22:37.140 { 00:22:37.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.140 "dma_device_type": 2 00:22:37.140 } 00:22:37.140 ], 00:22:37.140 "driver_specific": {} 00:22:37.140 }' 00:22:37.140 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.140 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.140 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:37.398 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:37.657 [2024-07-24 18:59:22.526219] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.657 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.914 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.914 "name": "Existed_Raid", 00:22:37.914 "uuid": 
"8be6d5dd-8f9e-4bb3-9027-37742b627d9f", 00:22:37.914 "strip_size_kb": 0, 00:22:37.914 "state": "online", 00:22:37.914 "raid_level": "raid1", 00:22:37.914 "superblock": true, 00:22:37.914 "num_base_bdevs": 2, 00:22:37.914 "num_base_bdevs_discovered": 1, 00:22:37.914 "num_base_bdevs_operational": 1, 00:22:37.914 "base_bdevs_list": [ 00:22:37.915 { 00:22:37.915 "name": null, 00:22:37.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.915 "is_configured": false, 00:22:37.915 "data_offset": 256, 00:22:37.915 "data_size": 7936 00:22:37.915 }, 00:22:37.915 { 00:22:37.915 "name": "BaseBdev2", 00:22:37.915 "uuid": "4cb6da4d-2b27-4640-8e93-7b39d273e088", 00:22:37.915 "is_configured": true, 00:22:37.915 "data_offset": 256, 00:22:37.915 "data_size": 7936 00:22:37.915 } 00:22:37.915 ] 00:22:37.915 }' 00:22:37.915 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.915 18:59:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:38.481 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:38.753 [2024-07-24 18:59:23.517703] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:38.753 [2024-07-24 18:59:23.517771] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:38.753 [2024-07-24 18:59:23.527893] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.753 [2024-07-24 18:59:23.527917] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:38.753 [2024-07-24 18:59:23.527923] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1831c80 name Existed_Raid, state offline 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:38.753 18:59:23 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2198712 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2198712 ']' 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2198712 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2198712 00:22:38.753 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2198712' 00:22:39.011 killing process with pid 2198712 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2198712 00:22:39.011 [2024-07-24 18:59:23.763615] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2198712 00:22:39.011 [2024-07-24 18:59:23.764389] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:22:39.011 00:22:39.011 real 0m8.158s 00:22:39.011 user 0m14.562s 00:22:39.011 sys 0m1.374s 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:39.011 18:59:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:39.011 ************************************ 00:22:39.011 END TEST raid_state_function_test_sb_md_interleaved 00:22:39.011 ************************************ 00:22:39.011 18:59:23 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:22:39.011 18:59:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:39.011 18:59:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:39.011 18:59:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:39.011 ************************************ 00:22:39.011 START TEST raid_superblock_test_md_interleaved 00:22:39.011 ************************************ 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2200299 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2200299 /var/tmp/spdk-raid.sock 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2200299 ']' 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:39.011 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:39.011 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:39.269 [2024-07-24 18:59:24.043974] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:22:39.269 [2024-07-24 18:59:24.044010] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2200299 ] 00:22:39.269 [2024-07-24 18:59:24.102205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:39.269 [2024-07-24 18:59:24.181360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:39.269 [2024-07-24 18:59:24.227855] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:39.269 [2024-07-24 18:59:24.227871] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:40.203 18:59:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:22:40.203 malloc1 00:22:40.203 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:40.203 [2024-07-24 18:59:25.184150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:40.203 [2024-07-24 18:59:25.184185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.203 [2024-07-24 18:59:25.184196] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138a770 00:22:40.203 [2024-07-24 18:59:25.184202] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.203 [2024-07-24 18:59:25.185157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.203 [2024-07-24 18:59:25.185177] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:40.203 pt1 00:22:40.203 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:40.203 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:22:40.461 malloc2 00:22:40.461 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:40.720 [2024-07-24 18:59:25.540882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:40.720 [2024-07-24 18:59:25.540915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.720 [2024-07-24 18:59:25.540924] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1517f10 00:22:40.720 [2024-07-24 18:59:25.540930] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.720 [2024-07-24 18:59:25.541833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.720 [2024-07-24 18:59:25.541852] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:40.720 pt2 00:22:40.720 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:40.720 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.720 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:22:40.720 [2024-07-24 18:59:25.725370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:40.720 [2024-07-24 18:59:25.726370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:40.720 [2024-07-24 18:59:25.726483] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1519880 00:22:40.720 [2024-07-24 18:59:25.726493] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:40.720 [2024-07-24 18:59:25.726537] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1388960 00:22:40.720 [2024-07-24 18:59:25.726595] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1519880 00:22:40.720 [2024-07-24 18:59:25.726601] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1519880 00:22:40.720 [2024-07-24 18:59:25.726637] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.978 "name": "raid_bdev1", 00:22:40.978 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:40.978 "strip_size_kb": 0, 00:22:40.978 "state": "online", 00:22:40.978 "raid_level": "raid1", 00:22:40.978 "superblock": true, 00:22:40.978 "num_base_bdevs": 2, 00:22:40.978 "num_base_bdevs_discovered": 2, 00:22:40.978 "num_base_bdevs_operational": 2, 00:22:40.978 "base_bdevs_list": [ 00:22:40.978 { 00:22:40.978 "name": "pt1", 00:22:40.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:40.978 "is_configured": true, 00:22:40.978 "data_offset": 256, 00:22:40.978 "data_size": 7936 00:22:40.978 }, 00:22:40.978 { 00:22:40.978 "name": "pt2", 00:22:40.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:40.978 "is_configured": true, 00:22:40.978 "data_offset": 256, 00:22:40.978 "data_size": 7936 00:22:40.978 } 00:22:40.978 ] 00:22:40.978 }' 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.978 18:59:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:41.545 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:41.803 [2024-07-24 18:59:26.563725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:41.803 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:41.803 "name": "raid_bdev1", 00:22:41.803 "aliases": [ 00:22:41.803 "a991cbdb-5957-46d5-8d53-f29784d803cc" 00:22:41.803 ], 00:22:41.803 "product_name": "Raid Volume", 00:22:41.803 "block_size": 4128, 00:22:41.803 "num_blocks": 7936, 00:22:41.803 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:41.803 "md_size": 32, 00:22:41.803 "md_interleave": true, 00:22:41.803 "dif_type": 0, 00:22:41.803 "assigned_rate_limits": { 00:22:41.803 "rw_ios_per_sec": 0, 00:22:41.803 "rw_mbytes_per_sec": 0, 00:22:41.803 "r_mbytes_per_sec": 0, 00:22:41.803 "w_mbytes_per_sec": 0 00:22:41.803 }, 00:22:41.803 "claimed": false, 00:22:41.803 "zoned": false, 00:22:41.803 "supported_io_types": { 00:22:41.803 "read": true, 00:22:41.803 "write": true, 00:22:41.803 "unmap": false, 00:22:41.803 "flush": false, 00:22:41.803 "reset": true, 00:22:41.803 "nvme_admin": false, 00:22:41.803 "nvme_io": false, 00:22:41.803 "nvme_io_md": false, 00:22:41.803 "write_zeroes": true, 00:22:41.803 "zcopy": false, 00:22:41.803 "get_zone_info": false, 00:22:41.803 "zone_management": false, 00:22:41.803 "zone_append": false, 00:22:41.803 "compare": false, 00:22:41.803 "compare_and_write": false, 00:22:41.803 "abort": false, 00:22:41.803 "seek_hole": false, 00:22:41.803 "seek_data": false, 00:22:41.803 "copy": false, 00:22:41.803 "nvme_iov_md": false 00:22:41.803 }, 00:22:41.803 "memory_domains": [ 00:22:41.803 { 00:22:41.803 "dma_device_id": "system", 00:22:41.803 "dma_device_type": 1 00:22:41.803 }, 00:22:41.803 { 00:22:41.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.803 "dma_device_type": 2 00:22:41.803 }, 00:22:41.803 { 00:22:41.803 "dma_device_id": "system", 00:22:41.803 "dma_device_type": 1 00:22:41.803 }, 00:22:41.803 { 00:22:41.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.803 "dma_device_type": 2 00:22:41.803 } 00:22:41.803 ], 00:22:41.803 "driver_specific": { 00:22:41.803 "raid": { 00:22:41.803 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:41.803 "strip_size_kb": 0, 00:22:41.803 "state": "online", 00:22:41.803 "raid_level": "raid1", 00:22:41.803 "superblock": true, 00:22:41.803 "num_base_bdevs": 2, 00:22:41.803 "num_base_bdevs_discovered": 2, 00:22:41.803 "num_base_bdevs_operational": 2, 00:22:41.803 "base_bdevs_list": [ 00:22:41.803 { 00:22:41.803 "name": "pt1", 00:22:41.803 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:41.803 "is_configured": true, 00:22:41.803 "data_offset": 256, 00:22:41.803 "data_size": 7936 00:22:41.803 }, 00:22:41.803 { 00:22:41.803 "name": "pt2", 00:22:41.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.803 "is_configured": true, 00:22:41.803 "data_offset": 256, 00:22:41.803 "data_size": 7936 00:22:41.803 } 00:22:41.803 ] 00:22:41.803 } 00:22:41.803 } 00:22:41.803 }' 00:22:41.803 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:41.803 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:41.804 pt2' 00:22:41.804 
18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.804 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:41.804 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.804 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.804 "name": "pt1", 00:22:41.804 "aliases": [ 00:22:41.804 "00000000-0000-0000-0000-000000000001" 00:22:41.804 ], 00:22:41.804 "product_name": "passthru", 00:22:41.804 "block_size": 4128, 00:22:41.804 "num_blocks": 8192, 00:22:41.804 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:41.804 "md_size": 32, 00:22:41.804 "md_interleave": true, 00:22:41.804 "dif_type": 0, 00:22:41.804 "assigned_rate_limits": { 00:22:41.804 "rw_ios_per_sec": 0, 00:22:41.804 "rw_mbytes_per_sec": 0, 00:22:41.804 "r_mbytes_per_sec": 0, 00:22:41.804 "w_mbytes_per_sec": 0 00:22:41.804 }, 00:22:41.804 "claimed": true, 00:22:41.804 "claim_type": "exclusive_write", 00:22:41.804 "zoned": false, 00:22:41.804 "supported_io_types": { 00:22:41.804 "read": true, 00:22:41.804 "write": true, 00:22:41.804 "unmap": true, 00:22:41.804 "flush": true, 00:22:41.804 "reset": true, 00:22:41.804 "nvme_admin": false, 00:22:41.804 "nvme_io": false, 00:22:41.804 "nvme_io_md": false, 00:22:41.804 "write_zeroes": true, 00:22:41.804 "zcopy": true, 00:22:41.804 "get_zone_info": false, 00:22:41.804 "zone_management": false, 00:22:41.804 "zone_append": false, 00:22:41.804 "compare": false, 00:22:41.804 "compare_and_write": false, 00:22:41.804 "abort": true, 00:22:41.804 "seek_hole": false, 00:22:41.804 "seek_data": false, 00:22:41.804 "copy": true, 00:22:41.804 "nvme_iov_md": false 00:22:41.804 }, 00:22:41.804 "memory_domains": [ 00:22:41.804 { 00:22:41.804 "dma_device_id": "system", 00:22:41.804 "dma_device_type": 1 00:22:41.804 }, 00:22:41.804 { 00:22:41.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.804 "dma_device_type": 2 00:22:41.804 } 00:22:41.804 ], 00:22:41.804 "driver_specific": { 00:22:41.804 "passthru": { 00:22:41.804 "name": "pt1", 00:22:41.804 "base_bdev_name": "malloc1" 00:22:41.804 } 00:22:41.804 } 00:22:41.804 }' 00:22:41.804 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.062 18:59:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.062 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:42.062 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.334 18:59:27 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:42.334 "name": "pt2", 00:22:42.334 "aliases": [ 00:22:42.334 "00000000-0000-0000-0000-000000000002" 00:22:42.334 ], 00:22:42.334 "product_name": "passthru", 00:22:42.334 "block_size": 4128, 00:22:42.334 "num_blocks": 8192, 00:22:42.334 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:42.334 "md_size": 32, 00:22:42.334 "md_interleave": true, 00:22:42.334 "dif_type": 0, 00:22:42.334 "assigned_rate_limits": { 00:22:42.334 "rw_ios_per_sec": 0, 00:22:42.334 "rw_mbytes_per_sec": 0, 00:22:42.334 "r_mbytes_per_sec": 0, 00:22:42.334 "w_mbytes_per_sec": 0 00:22:42.334 }, 00:22:42.334 "claimed": true, 00:22:42.334 "claim_type": "exclusive_write", 00:22:42.334 "zoned": false, 00:22:42.334 "supported_io_types": { 00:22:42.334 "read": true, 00:22:42.334 "write": true, 00:22:42.334 "unmap": true, 00:22:42.334 "flush": true, 00:22:42.334 "reset": true, 00:22:42.334 "nvme_admin": false, 00:22:42.334 "nvme_io": false, 00:22:42.334 "nvme_io_md": false, 00:22:42.334 "write_zeroes": true, 00:22:42.334 "zcopy": true, 00:22:42.334 "get_zone_info": false, 00:22:42.334 "zone_management": false, 00:22:42.334 "zone_append": false, 00:22:42.334 "compare": false, 00:22:42.334 "compare_and_write": false, 00:22:42.334 "abort": true, 00:22:42.334 "seek_hole": false, 00:22:42.334 "seek_data": false, 00:22:42.334 "copy": true, 00:22:42.334 "nvme_iov_md": false 00:22:42.334 }, 00:22:42.334 "memory_domains": [ 00:22:42.334 { 00:22:42.334 "dma_device_id": "system", 00:22:42.334 "dma_device_type": 1 00:22:42.334 }, 00:22:42.334 { 00:22:42.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.334 "dma_device_type": 2 00:22:42.334 } 00:22:42.334 ], 00:22:42.334 "driver_specific": { 00:22:42.334 "passthru": { 00:22:42.334 "name": "pt2", 00:22:42.334 "base_bdev_name": "malloc2" 00:22:42.334 } 00:22:42.334 } 00:22:42.334 }' 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.334 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.592 18:59:27 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:42.592 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:42.849 [2024-07-24 18:59:27.746720] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:42.849 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a991cbdb-5957-46d5-8d53-f29784d803cc 00:22:42.849 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z a991cbdb-5957-46d5-8d53-f29784d803cc ']' 00:22:42.849 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:43.108 [2024-07-24 18:59:27.923021] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:43.108 [2024-07-24 18:59:27.923031] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:43.108 [2024-07-24 18:59:27.923069] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:43.108 [2024-07-24 18:59:27.923104] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:43.108 [2024-07-24 18:59:27.923110] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1519880 name raid_bdev1, state offline 00:22:43.108 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.108 18:59:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:43.108 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:43.108 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:43.108 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:43.108 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:43.378 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:43.378 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:43.724 18:59:28 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:43.724 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:43.983 [2024-07-24 18:59:28.757174] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:43.983 [2024-07-24 18:59:28.758155] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:43.983 [2024-07-24 18:59:28.758196] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:43.983 [2024-07-24 18:59:28.758224] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:43.983 [2024-07-24 18:59:28.758233] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:43.983 [2024-07-24 18:59:28.758254] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1516dc0 name raid_bdev1, state configuring 00:22:43.983 request: 00:22:43.983 { 00:22:43.983 "name": "raid_bdev1", 00:22:43.983 "raid_level": "raid1", 00:22:43.983 "base_bdevs": [ 00:22:43.983 "malloc1", 00:22:43.983 "malloc2" 00:22:43.983 ], 00:22:43.983 "superblock": false, 00:22:43.983 "method": 
"bdev_raid_create", 00:22:43.983 "req_id": 1 00:22:43.983 } 00:22:43.983 Got JSON-RPC error response 00:22:43.983 response: 00:22:43.983 { 00:22:43.983 "code": -17, 00:22:43.983 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:43.983 } 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:43.983 18:59:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:44.242 [2024-07-24 18:59:29.102019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:44.242 [2024-07-24 18:59:29.102053] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:44.242 [2024-07-24 18:59:29.102063] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1388760 00:22:44.242 [2024-07-24 18:59:29.102085] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:44.242 [2024-07-24 18:59:29.103098] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:44.242 [2024-07-24 18:59:29.103117] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:44.242 [2024-07-24 18:59:29.103147] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:44.242 [2024-07-24 18:59:29.103164] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:44.242 pt1 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.242 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.501 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.501 "name": "raid_bdev1", 00:22:44.501 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:44.501 "strip_size_kb": 0, 00:22:44.501 "state": "configuring", 00:22:44.501 "raid_level": "raid1", 00:22:44.501 "superblock": true, 00:22:44.501 "num_base_bdevs": 2, 00:22:44.501 "num_base_bdevs_discovered": 1, 00:22:44.501 "num_base_bdevs_operational": 2, 00:22:44.501 "base_bdevs_list": [ 00:22:44.501 { 00:22:44.501 "name": "pt1", 00:22:44.501 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:44.501 "is_configured": true, 00:22:44.501 "data_offset": 256, 00:22:44.501 "data_size": 7936 00:22:44.501 }, 00:22:44.501 { 00:22:44.501 "name": null, 00:22:44.501 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:44.501 "is_configured": false, 00:22:44.501 "data_offset": 256, 00:22:44.501 "data_size": 7936 00:22:44.501 } 00:22:44.501 ] 00:22:44.501 }' 00:22:44.501 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.501 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:44.759 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:44.759 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:44.759 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:44.759 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:45.016 [2024-07-24 18:59:29.916119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:45.016 [2024-07-24 18:59:29.916149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:45.017 [2024-07-24 18:59:29.916160] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151cb90 00:22:45.017 [2024-07-24 18:59:29.916181] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:45.017 [2024-07-24 18:59:29.916301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:45.017 [2024-07-24 18:59:29.916311] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:45.017 [2024-07-24 18:59:29.916338] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:45.017 [2024-07-24 18:59:29.916349] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:45.017 [2024-07-24 18:59:29.916403] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x151b2a0 00:22:45.017 [2024-07-24 18:59:29.916409] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:45.017 [2024-07-24 18:59:29.916446] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151c2a0 00:22:45.017 [2024-07-24 18:59:29.916505] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151b2a0 00:22:45.017 [2024-07-24 18:59:29.916511] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151b2a0 00:22:45.017 [2024-07-24 18:59:29.916550] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:45.017 pt2 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.017 18:59:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.275 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:45.275 "name": "raid_bdev1", 00:22:45.275 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:45.275 "strip_size_kb": 0, 00:22:45.275 "state": "online", 00:22:45.275 "raid_level": "raid1", 00:22:45.275 "superblock": true, 00:22:45.275 "num_base_bdevs": 2, 00:22:45.275 "num_base_bdevs_discovered": 2, 00:22:45.275 "num_base_bdevs_operational": 2, 00:22:45.275 "base_bdevs_list": [ 00:22:45.275 { 00:22:45.275 "name": "pt1", 00:22:45.275 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:45.275 "is_configured": true, 00:22:45.275 "data_offset": 256, 00:22:45.275 "data_size": 7936 00:22:45.275 }, 00:22:45.275 { 00:22:45.275 "name": "pt2", 00:22:45.275 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:45.275 "is_configured": true, 00:22:45.275 "data_offset": 256, 00:22:45.275 "data_size": 7936 00:22:45.275 } 00:22:45.275 ] 00:22:45.275 }' 00:22:45.275 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:45.275 18:59:30 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:45.842 [2024-07-24 18:59:30.754462] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:45.842 "name": "raid_bdev1", 00:22:45.842 "aliases": [ 00:22:45.842 "a991cbdb-5957-46d5-8d53-f29784d803cc" 00:22:45.842 ], 00:22:45.842 "product_name": "Raid Volume", 00:22:45.842 "block_size": 4128, 00:22:45.842 "num_blocks": 7936, 00:22:45.842 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:45.842 "md_size": 32, 00:22:45.842 "md_interleave": true, 00:22:45.842 "dif_type": 0, 00:22:45.842 "assigned_rate_limits": { 00:22:45.842 "rw_ios_per_sec": 0, 00:22:45.842 "rw_mbytes_per_sec": 0, 00:22:45.842 "r_mbytes_per_sec": 0, 00:22:45.842 "w_mbytes_per_sec": 0 00:22:45.842 }, 00:22:45.842 "claimed": false, 00:22:45.842 "zoned": false, 00:22:45.842 "supported_io_types": { 00:22:45.842 "read": true, 00:22:45.842 "write": true, 00:22:45.842 "unmap": false, 00:22:45.842 "flush": false, 00:22:45.842 "reset": true, 00:22:45.842 "nvme_admin": false, 00:22:45.842 "nvme_io": false, 00:22:45.842 "nvme_io_md": false, 00:22:45.842 "write_zeroes": true, 00:22:45.842 "zcopy": false, 00:22:45.842 "get_zone_info": false, 00:22:45.842 "zone_management": false, 00:22:45.842 "zone_append": false, 00:22:45.842 "compare": false, 00:22:45.842 "compare_and_write": false, 00:22:45.842 "abort": false, 00:22:45.842 "seek_hole": false, 00:22:45.842 "seek_data": false, 00:22:45.842 "copy": false, 00:22:45.842 "nvme_iov_md": false 00:22:45.842 }, 00:22:45.842 "memory_domains": [ 00:22:45.842 { 00:22:45.842 "dma_device_id": "system", 00:22:45.842 "dma_device_type": 1 00:22:45.842 }, 00:22:45.842 { 00:22:45.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.842 "dma_device_type": 2 00:22:45.842 }, 00:22:45.842 { 00:22:45.842 "dma_device_id": "system", 00:22:45.842 "dma_device_type": 1 00:22:45.842 }, 00:22:45.842 { 00:22:45.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.842 "dma_device_type": 2 00:22:45.842 } 00:22:45.842 ], 00:22:45.842 "driver_specific": { 00:22:45.842 "raid": { 00:22:45.842 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:45.842 "strip_size_kb": 0, 00:22:45.842 "state": "online", 00:22:45.842 "raid_level": "raid1", 00:22:45.842 "superblock": true, 00:22:45.842 "num_base_bdevs": 2, 00:22:45.842 
"num_base_bdevs_discovered": 2, 00:22:45.842 "num_base_bdevs_operational": 2, 00:22:45.842 "base_bdevs_list": [ 00:22:45.842 { 00:22:45.842 "name": "pt1", 00:22:45.842 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:45.842 "is_configured": true, 00:22:45.842 "data_offset": 256, 00:22:45.842 "data_size": 7936 00:22:45.842 }, 00:22:45.842 { 00:22:45.842 "name": "pt2", 00:22:45.842 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:45.842 "is_configured": true, 00:22:45.842 "data_offset": 256, 00:22:45.842 "data_size": 7936 00:22:45.842 } 00:22:45.842 ] 00:22:45.842 } 00:22:45.842 } 00:22:45.842 }' 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:45.842 pt2' 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:45.842 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:46.100 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:46.100 "name": "pt1", 00:22:46.100 "aliases": [ 00:22:46.101 "00000000-0000-0000-0000-000000000001" 00:22:46.101 ], 00:22:46.101 "product_name": "passthru", 00:22:46.101 "block_size": 4128, 00:22:46.101 "num_blocks": 8192, 00:22:46.101 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:46.101 "md_size": 32, 00:22:46.101 "md_interleave": true, 00:22:46.101 "dif_type": 0, 00:22:46.101 "assigned_rate_limits": { 00:22:46.101 "rw_ios_per_sec": 0, 00:22:46.101 "rw_mbytes_per_sec": 0, 00:22:46.101 "r_mbytes_per_sec": 0, 00:22:46.101 "w_mbytes_per_sec": 0 00:22:46.101 }, 00:22:46.101 "claimed": true, 00:22:46.101 "claim_type": "exclusive_write", 00:22:46.101 "zoned": false, 00:22:46.101 "supported_io_types": { 00:22:46.101 "read": true, 00:22:46.101 "write": true, 00:22:46.101 "unmap": true, 00:22:46.101 "flush": true, 00:22:46.101 "reset": true, 00:22:46.101 "nvme_admin": false, 00:22:46.101 "nvme_io": false, 00:22:46.101 "nvme_io_md": false, 00:22:46.101 "write_zeroes": true, 00:22:46.101 "zcopy": true, 00:22:46.101 "get_zone_info": false, 00:22:46.101 "zone_management": false, 00:22:46.101 "zone_append": false, 00:22:46.101 "compare": false, 00:22:46.101 "compare_and_write": false, 00:22:46.101 "abort": true, 00:22:46.101 "seek_hole": false, 00:22:46.101 "seek_data": false, 00:22:46.101 "copy": true, 00:22:46.101 "nvme_iov_md": false 00:22:46.101 }, 00:22:46.101 "memory_domains": [ 00:22:46.101 { 00:22:46.101 "dma_device_id": "system", 00:22:46.101 "dma_device_type": 1 00:22:46.101 }, 00:22:46.101 { 00:22:46.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.101 "dma_device_type": 2 00:22:46.101 } 00:22:46.101 ], 00:22:46.101 "driver_specific": { 00:22:46.101 "passthru": { 00:22:46.101 "name": "pt1", 00:22:46.101 "base_bdev_name": "malloc1" 00:22:46.101 } 00:22:46.101 } 00:22:46.101 }' 00:22:46.101 18:59:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.101 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:22:46.101 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:46.101 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.101 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:46.359 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:46.617 "name": "pt2", 00:22:46.617 "aliases": [ 00:22:46.617 "00000000-0000-0000-0000-000000000002" 00:22:46.617 ], 00:22:46.617 "product_name": "passthru", 00:22:46.617 "block_size": 4128, 00:22:46.617 "num_blocks": 8192, 00:22:46.617 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:46.617 "md_size": 32, 00:22:46.617 "md_interleave": true, 00:22:46.617 "dif_type": 0, 00:22:46.617 "assigned_rate_limits": { 00:22:46.617 "rw_ios_per_sec": 0, 00:22:46.617 "rw_mbytes_per_sec": 0, 00:22:46.617 "r_mbytes_per_sec": 0, 00:22:46.617 "w_mbytes_per_sec": 0 00:22:46.617 }, 00:22:46.617 "claimed": true, 00:22:46.617 "claim_type": "exclusive_write", 00:22:46.617 "zoned": false, 00:22:46.617 "supported_io_types": { 00:22:46.617 "read": true, 00:22:46.617 "write": true, 00:22:46.617 "unmap": true, 00:22:46.617 "flush": true, 00:22:46.617 "reset": true, 00:22:46.617 "nvme_admin": false, 00:22:46.617 "nvme_io": false, 00:22:46.617 "nvme_io_md": false, 00:22:46.617 "write_zeroes": true, 00:22:46.617 "zcopy": true, 00:22:46.617 "get_zone_info": false, 00:22:46.617 "zone_management": false, 00:22:46.617 "zone_append": false, 00:22:46.617 "compare": false, 00:22:46.617 "compare_and_write": false, 00:22:46.617 "abort": true, 00:22:46.617 "seek_hole": false, 00:22:46.617 "seek_data": false, 00:22:46.617 "copy": true, 00:22:46.617 "nvme_iov_md": false 00:22:46.617 }, 00:22:46.617 "memory_domains": [ 00:22:46.617 { 00:22:46.617 "dma_device_id": "system", 00:22:46.617 "dma_device_type": 1 00:22:46.617 }, 00:22:46.617 { 00:22:46.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.617 "dma_device_type": 2 00:22:46.617 } 00:22:46.617 ], 00:22:46.617 "driver_specific": { 00:22:46.617 "passthru": { 00:22:46.617 "name": "pt2", 00:22:46.617 "base_bdev_name": "malloc2" 00:22:46.617 } 00:22:46.617 } 00:22:46.617 }' 00:22:46.617 18:59:31 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.617 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.875 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:46.876 [2024-07-24 18:59:31.845274] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' a991cbdb-5957-46d5-8d53-f29784d803cc '!=' a991cbdb-5957-46d5-8d53-f29784d803cc ']' 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:22:46.876 18:59:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:47.134 [2024-07-24 18:59:32.013562] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.134 18:59:32 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.134 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.393 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.393 "name": "raid_bdev1", 00:22:47.393 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:47.393 "strip_size_kb": 0, 00:22:47.393 "state": "online", 00:22:47.393 "raid_level": "raid1", 00:22:47.393 "superblock": true, 00:22:47.393 "num_base_bdevs": 2, 00:22:47.393 "num_base_bdevs_discovered": 1, 00:22:47.393 "num_base_bdevs_operational": 1, 00:22:47.393 "base_bdevs_list": [ 00:22:47.393 { 00:22:47.393 "name": null, 00:22:47.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.393 "is_configured": false, 00:22:47.393 "data_offset": 256, 00:22:47.393 "data_size": 7936 00:22:47.393 }, 00:22:47.393 { 00:22:47.393 "name": "pt2", 00:22:47.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:47.393 "is_configured": true, 00:22:47.393 "data_offset": 256, 00:22:47.393 "data_size": 7936 00:22:47.393 } 00:22:47.393 ] 00:22:47.393 }' 00:22:47.393 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.393 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:47.960 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:47.960 [2024-07-24 18:59:32.831651] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:47.960 [2024-07-24 18:59:32.831671] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:47.960 [2024-07-24 18:59:32.831714] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:47.960 [2024-07-24 18:59:32.831748] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:47.960 [2024-07-24 18:59:32.831754] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151b2a0 name raid_bdev1, state offline 00:22:47.960 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:47.960 18:59:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
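The trace around this point exercises degraded operation and re-assembly: with a superblock in place the raid1 volume stays online after pt1 is removed (num_base_bdevs_discovered drops to 1), it can then be deleted explicitly, and re-registering a base bdev whose on-disk superblock matches is enough for the examine path to re-assemble raid_bdev1 automatically. Roughly the same sequence by hand, assuming the same socket (rpc.py abbreviates the full scripts/rpc.py path used in the trace):
  # delete the (now degraded) array and its remaining base bdev
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
  # re-registering pt2 re-assembles raid_bdev1 from its superblock, online with 1 of 2 members
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all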
00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:22:48.219 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:48.486 [2024-07-24 18:59:33.324912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:48.486 [2024-07-24 18:59:33.324943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.486 [2024-07-24 18:59:33.324956] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151cf40 00:22:48.486 [2024-07-24 18:59:33.324962] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.486 [2024-07-24 18:59:33.325998] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.486 [2024-07-24 18:59:33.326017] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:48.486 [2024-07-24 18:59:33.326049] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:48.486 [2024-07-24 18:59:33.326066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:48.486 [2024-07-24 18:59:33.326111] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x151b9c0 00:22:48.486 [2024-07-24 18:59:33.326116] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:48.486 [2024-07-24 18:59:33.326155] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1518ab0 00:22:48.486 [2024-07-24 18:59:33.326204] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151b9c0 00:22:48.486 [2024-07-24 18:59:33.326209] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151b9c0 00:22:48.486 [2024-07-24 18:59:33.326246] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:48.486 pt2 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.486 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.747 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.747 "name": "raid_bdev1", 00:22:48.747 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:48.747 "strip_size_kb": 0, 00:22:48.747 "state": "online", 00:22:48.747 "raid_level": "raid1", 00:22:48.747 "superblock": true, 00:22:48.747 "num_base_bdevs": 2, 00:22:48.747 "num_base_bdevs_discovered": 1, 00:22:48.747 "num_base_bdevs_operational": 1, 00:22:48.747 "base_bdevs_list": [ 00:22:48.747 { 00:22:48.747 "name": null, 00:22:48.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.747 "is_configured": false, 00:22:48.747 "data_offset": 256, 00:22:48.747 "data_size": 7936 00:22:48.747 }, 00:22:48.747 { 00:22:48.747 "name": "pt2", 00:22:48.747 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:48.747 "is_configured": true, 00:22:48.747 "data_offset": 256, 00:22:48.747 "data_size": 7936 00:22:48.747 } 00:22:48.747 ] 00:22:48.747 }' 00:22:48.747 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.747 18:59:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:49.005 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:49.276 [2024-07-24 18:59:34.167098] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:49.276 [2024-07-24 18:59:34.167122] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:49.276 [2024-07-24 18:59:34.167166] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:49.276 [2024-07-24 18:59:34.167202] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:49.276 [2024-07-24 18:59:34.167208] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151b9c0 name raid_bdev1, state offline 00:22:49.276 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.276 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:49.535 [2024-07-24 18:59:34.515984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:49.535 [2024-07-24 18:59:34.516015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.535 [2024-07-24 18:59:34.516024] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1518930 00:22:49.535 [2024-07-24 18:59:34.516046] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.535 [2024-07-24 18:59:34.517073] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.535 [2024-07-24 18:59:34.517091] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:49.535 [2024-07-24 18:59:34.517122] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:49.535 [2024-07-24 18:59:34.517138] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:49.535 [2024-07-24 18:59:34.517188] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:49.535 [2024-07-24 18:59:34.517194] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:49.535 [2024-07-24 18:59:34.517203] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151c5d0 name raid_bdev1, state configuring 00:22:49.535 [2024-07-24 18:59:34.517216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:49.535 [2024-07-24 18:59:34.517249] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x138a9a0 00:22:49.535 [2024-07-24 18:59:34.517254] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:49.535 [2024-07-24 18:59:34.517291] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151c750 00:22:49.535 [2024-07-24 18:59:34.517336] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138a9a0 00:22:49.535 [2024-07-24 18:59:34.517341] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x138a9a0 00:22:49.535 [2024-07-24 18:59:34.517378] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.535 pt1 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:49.535 18:59:34 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.535 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.800 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.800 "name": "raid_bdev1", 00:22:49.800 "uuid": "a991cbdb-5957-46d5-8d53-f29784d803cc", 00:22:49.800 "strip_size_kb": 0, 00:22:49.800 "state": "online", 00:22:49.800 "raid_level": "raid1", 00:22:49.800 "superblock": true, 00:22:49.800 "num_base_bdevs": 2, 00:22:49.800 "num_base_bdevs_discovered": 1, 00:22:49.800 "num_base_bdevs_operational": 1, 00:22:49.800 "base_bdevs_list": [ 00:22:49.800 { 00:22:49.800 "name": null, 00:22:49.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.800 "is_configured": false, 00:22:49.800 "data_offset": 256, 00:22:49.800 "data_size": 7936 00:22:49.800 }, 00:22:49.800 { 00:22:49.800 "name": "pt2", 00:22:49.800 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:49.800 "is_configured": true, 00:22:49.800 "data_offset": 256, 00:22:49.800 "data_size": 7936 00:22:49.800 } 00:22:49.800 ] 00:22:49.800 }' 00:22:49.800 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.800 18:59:34 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:50.368 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:50.368 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:50.368 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:50.368 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:50.368 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:50.626 [2024-07-24 18:59:35.522753] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.626 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' a991cbdb-5957-46d5-8d53-f29784d803cc '!=' a991cbdb-5957-46d5-8d53-f29784d803cc ']' 00:22:50.626 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2200299 00:22:50.626 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2200299 ']' 00:22:50.626 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2200299 00:22:50.626 18:59:35 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:22:50.626 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2200299 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2200299' 00:22:50.627 killing process with pid 2200299 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2200299 00:22:50.627 [2024-07-24 18:59:35.576299] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:50.627 [2024-07-24 18:59:35.576345] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.627 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2200299 00:22:50.627 [2024-07-24 18:59:35.576377] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.627 [2024-07-24 18:59:35.576384] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138a9a0 name raid_bdev1, state offline 00:22:50.627 [2024-07-24 18:59:35.591896] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:50.885 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:22:50.885 00:22:50.885 real 0m11.760s 00:22:50.885 user 0m21.543s 00:22:50.885 sys 0m1.926s 00:22:50.885 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:50.885 18:59:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:50.885 ************************************ 00:22:50.885 END TEST raid_superblock_test_md_interleaved 00:22:50.885 ************************************ 00:22:50.885 18:59:35 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:22:50.885 18:59:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:50.886 18:59:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:50.886 18:59:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:50.886 ************************************ 00:22:50.886 START TEST raid_rebuild_test_sb_md_interleaved 00:22:50.886 ************************************ 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 
-- # local verify=false 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2202658 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2202658 /var/tmp/spdk-raid.sock 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2202658 ']' 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:50.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
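All of the verify_raid_bdev_state checks traced above boil down to one pattern: query the target over its UNIX-domain RPC socket and filter the JSON with jq. A minimal illustrative sketch of that query, reusing the rpc.py path, socket and bdev name exactly as they appear in the trace (the trailing field check is only an example and assumes the target from this run is still listening):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Fetch every raid bdev and keep only the one under test.
  raid_bdev_info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
      jq -r '.[] | select(.name == "raid_bdev1")')
  # Individual fields can then be compared against the expected values,
  # e.g. the state and raid_level seen in the JSON dumps earlier in the log.
  [ "$(jq -r '.state' <<< "$raid_bdev_info")" = online ] || echo "unexpected state" >&2
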
00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:50.886 18:59:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:50.886 [2024-07-24 18:59:35.881823] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:22:50.886 [2024-07-24 18:59:35.881861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202658 ] 00:22:50.886 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:50.886 Zero copy mechanism will not be used. 00:22:51.145 [2024-07-24 18:59:35.945453] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.145 [2024-07-24 18:59:36.023652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.145 [2024-07-24 18:59:36.074044] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.145 [2024-07-24 18:59:36.074072] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.711 18:59:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:51.711 18:59:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:22:51.712 18:59:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:51.712 18:59:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:22:51.969 BaseBdev1_malloc 00:22:51.969 18:59:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:52.227 [2024-07-24 18:59:36.998282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:52.227 [2024-07-24 18:59:36.998315] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.227 [2024-07-24 18:59:36.998329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efaa90 00:22:52.227 [2024-07-24 18:59:36.998350] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.228 [2024-07-24 18:59:36.999353] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.228 [2024-07-24 18:59:36.999372] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:52.228 BaseBdev1 00:22:52.228 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:52.228 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:22:52.228 BaseBdev2_malloc 00:22:52.228 18:59:37 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:52.487 [2024-07-24 18:59:37.338843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:52.487 [2024-07-24 18:59:37.338876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.487 [2024-07-24 18:59:37.338887] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2088370 00:22:52.487 [2024-07-24 18:59:37.338908] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.487 [2024-07-24 18:59:37.339918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.487 [2024-07-24 18:59:37.339937] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:52.487 BaseBdev2 00:22:52.487 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:22:52.745 spare_malloc 00:22:52.745 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:52.745 spare_delay 00:22:52.745 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:53.004 [2024-07-24 18:59:37.847940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:53.004 [2024-07-24 18:59:37.847973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.004 [2024-07-24 18:59:37.847985] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2088a10 00:22:53.004 [2024-07-24 18:59:37.847990] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.004 [2024-07-24 18:59:37.849083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.004 [2024-07-24 18:59:37.849105] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:53.004 spare 00:22:53.004 18:59:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:53.269 [2024-07-24 18:59:38.016397] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:53.269 [2024-07-24 18:59:38.017303] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:53.270 [2024-07-24 18:59:38.017423] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x207cd80 00:22:53.270 [2024-07-24 18:59:38.017431] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:53.270 [2024-07-24 18:59:38.017490] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2086ab0 00:22:53.270 [2024-07-24 18:59:38.017548] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x207cd80 00:22:53.270 [2024-07-24 18:59:38.017553] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x207cd80 00:22:53.270 [2024-07-24 18:59:38.017590] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.270 "name": "raid_bdev1", 00:22:53.270 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:53.270 "strip_size_kb": 0, 00:22:53.270 "state": "online", 00:22:53.270 "raid_level": "raid1", 00:22:53.270 "superblock": true, 00:22:53.270 "num_base_bdevs": 2, 00:22:53.270 "num_base_bdevs_discovered": 2, 00:22:53.270 "num_base_bdevs_operational": 2, 00:22:53.270 "base_bdevs_list": [ 00:22:53.270 { 00:22:53.270 "name": "BaseBdev1", 00:22:53.270 "uuid": "e5f53c1d-2608-5d93-ab76-1d23a95aee66", 00:22:53.270 "is_configured": true, 00:22:53.270 "data_offset": 256, 00:22:53.270 "data_size": 7936 00:22:53.270 }, 00:22:53.270 { 00:22:53.270 "name": "BaseBdev2", 00:22:53.270 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:53.270 "is_configured": true, 00:22:53.270 "data_offset": 256, 00:22:53.270 "data_size": 7936 00:22:53.270 } 00:22:53.270 ] 00:22:53.270 }' 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.270 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:53.837 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:53.837 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:53.837 [2024-07-24 18:59:38.838667] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:54.095 18:59:38 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:54.095 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.095 18:59:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:54.095 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:54.095 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:54.095 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:22:54.095 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:54.354 [2024-07-24 18:59:39.183510] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.354 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.613 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.613 "name": "raid_bdev1", 00:22:54.613 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:54.613 "strip_size_kb": 0, 00:22:54.613 "state": "online", 00:22:54.613 "raid_level": "raid1", 00:22:54.613 "superblock": true, 00:22:54.613 "num_base_bdevs": 2, 00:22:54.613 "num_base_bdevs_discovered": 1, 00:22:54.613 "num_base_bdevs_operational": 1, 00:22:54.614 "base_bdevs_list": [ 00:22:54.614 { 00:22:54.614 "name": null, 00:22:54.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.614 "is_configured": false, 00:22:54.614 "data_offset": 256, 00:22:54.614 "data_size": 7936 00:22:54.614 }, 00:22:54.614 { 00:22:54.614 "name": "BaseBdev2", 00:22:54.614 "uuid": 
"e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:54.614 "is_configured": true, 00:22:54.614 "data_offset": 256, 00:22:54.614 "data_size": 7936 00:22:54.614 } 00:22:54.614 ] 00:22:54.614 }' 00:22:54.614 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.614 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:54.872 18:59:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:55.130 [2024-07-24 18:59:40.025703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:55.130 [2024-07-24 18:59:40.028901] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2086ab0 00:22:55.130 [2024-07-24 18:59:40.030298] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:55.130 18:59:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.067 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.325 "name": "raid_bdev1", 00:22:56.325 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:56.325 "strip_size_kb": 0, 00:22:56.325 "state": "online", 00:22:56.325 "raid_level": "raid1", 00:22:56.325 "superblock": true, 00:22:56.325 "num_base_bdevs": 2, 00:22:56.325 "num_base_bdevs_discovered": 2, 00:22:56.325 "num_base_bdevs_operational": 2, 00:22:56.325 "process": { 00:22:56.325 "type": "rebuild", 00:22:56.325 "target": "spare", 00:22:56.325 "progress": { 00:22:56.325 "blocks": 2816, 00:22:56.325 "percent": 35 00:22:56.325 } 00:22:56.325 }, 00:22:56.325 "base_bdevs_list": [ 00:22:56.325 { 00:22:56.325 "name": "spare", 00:22:56.325 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:22:56.325 "is_configured": true, 00:22:56.325 "data_offset": 256, 00:22:56.325 "data_size": 7936 00:22:56.325 }, 00:22:56.325 { 00:22:56.325 "name": "BaseBdev2", 00:22:56.325 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:56.325 "is_configured": true, 00:22:56.325 "data_offset": 256, 00:22:56.325 "data_size": 7936 00:22:56.325 } 00:22:56.325 ] 00:22:56.325 }' 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:56.325 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:56.584 [2024-07-24 18:59:41.442524] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.584 [2024-07-24 18:59:41.540863] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:56.584 [2024-07-24 18:59:41.540895] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.584 [2024-07-24 18:59:41.540904] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.584 [2024-07-24 18:59:41.540919] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.584 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.843 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.843 "name": "raid_bdev1", 00:22:56.843 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:56.843 "strip_size_kb": 0, 00:22:56.843 "state": "online", 00:22:56.843 "raid_level": "raid1", 00:22:56.843 "superblock": true, 00:22:56.843 "num_base_bdevs": 2, 00:22:56.843 "num_base_bdevs_discovered": 1, 00:22:56.843 "num_base_bdevs_operational": 1, 00:22:56.843 "base_bdevs_list": [ 00:22:56.843 { 00:22:56.843 "name": null, 00:22:56.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.843 "is_configured": false, 00:22:56.843 "data_offset": 256, 00:22:56.843 "data_size": 7936 00:22:56.843 }, 00:22:56.843 { 00:22:56.843 "name": "BaseBdev2", 00:22:56.843 "uuid": 
"e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:56.843 "is_configured": true, 00:22:56.843 "data_offset": 256, 00:22:56.843 "data_size": 7936 00:22:56.843 } 00:22:56.843 ] 00:22:56.843 }' 00:22:56.843 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.843 18:59:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.410 "name": "raid_bdev1", 00:22:57.410 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:57.410 "strip_size_kb": 0, 00:22:57.410 "state": "online", 00:22:57.410 "raid_level": "raid1", 00:22:57.410 "superblock": true, 00:22:57.410 "num_base_bdevs": 2, 00:22:57.410 "num_base_bdevs_discovered": 1, 00:22:57.410 "num_base_bdevs_operational": 1, 00:22:57.410 "base_bdevs_list": [ 00:22:57.410 { 00:22:57.410 "name": null, 00:22:57.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.410 "is_configured": false, 00:22:57.410 "data_offset": 256, 00:22:57.410 "data_size": 7936 00:22:57.410 }, 00:22:57.410 { 00:22:57.410 "name": "BaseBdev2", 00:22:57.410 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:57.410 "is_configured": true, 00:22:57.410 "data_offset": 256, 00:22:57.410 "data_size": 7936 00:22:57.410 } 00:22:57.410 ] 00:22:57.410 }' 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:57.410 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.668 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:57.668 18:59:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:57.668 [2024-07-24 18:59:42.611147] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:57.668 [2024-07-24 18:59:42.614241] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2086ab0 00:22:57.668 [2024-07-24 18:59:42.615231] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:57.668 18:59:42 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.051 "name": "raid_bdev1", 00:22:59.051 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:59.051 "strip_size_kb": 0, 00:22:59.051 "state": "online", 00:22:59.051 "raid_level": "raid1", 00:22:59.051 "superblock": true, 00:22:59.051 "num_base_bdevs": 2, 00:22:59.051 "num_base_bdevs_discovered": 2, 00:22:59.051 "num_base_bdevs_operational": 2, 00:22:59.051 "process": { 00:22:59.051 "type": "rebuild", 00:22:59.051 "target": "spare", 00:22:59.051 "progress": { 00:22:59.051 "blocks": 2816, 00:22:59.051 "percent": 35 00:22:59.051 } 00:22:59.051 }, 00:22:59.051 "base_bdevs_list": [ 00:22:59.051 { 00:22:59.051 "name": "spare", 00:22:59.051 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:22:59.051 "is_configured": true, 00:22:59.051 "data_offset": 256, 00:22:59.051 "data_size": 7936 00:22:59.051 }, 00:22:59.051 { 00:22:59.051 "name": "BaseBdev2", 00:22:59.051 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:59.051 "is_configured": true, 00:22:59.051 "data_offset": 256, 00:22:59.051 "data_size": 7936 00:22:59.051 } 00:22:59.051 ] 00:22:59.051 }' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:59.051 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:59.051 18:59:43 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=864 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.051 18:59:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.309 "name": "raid_bdev1", 00:22:59.309 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:22:59.309 "strip_size_kb": 0, 00:22:59.309 "state": "online", 00:22:59.309 "raid_level": "raid1", 00:22:59.309 "superblock": true, 00:22:59.309 "num_base_bdevs": 2, 00:22:59.309 "num_base_bdevs_discovered": 2, 00:22:59.309 "num_base_bdevs_operational": 2, 00:22:59.309 "process": { 00:22:59.309 "type": "rebuild", 00:22:59.309 "target": "spare", 00:22:59.309 "progress": { 00:22:59.309 "blocks": 3584, 00:22:59.309 "percent": 45 00:22:59.309 } 00:22:59.309 }, 00:22:59.309 "base_bdevs_list": [ 00:22:59.309 { 00:22:59.309 "name": "spare", 00:22:59.309 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:22:59.309 "is_configured": true, 00:22:59.309 "data_offset": 256, 00:22:59.309 "data_size": 7936 00:22:59.309 }, 00:22:59.309 { 00:22:59.309 "name": "BaseBdev2", 00:22:59.309 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:22:59.309 "is_configured": true, 00:22:59.309 "data_offset": 256, 00:22:59.309 "data_size": 7936 00:22:59.309 } 00:22:59.309 ] 00:22:59.309 }' 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.309 18:59:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 
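The "[: =: unary operator expected" message captured just above comes from bdev_raid.sh line 665, where a single-bracket test was evaluated with a variable that expanded to nothing ('[' = false ']'). A small stand-alone reproduction and the usual guards, shown purely as an illustration (the flag name below is hypothetical, not the script's actual variable):

  #!/usr/bin/env bash
  flag=""                              # hypothetical empty/unset flag
  # Unquoted, [ $flag = false ] collapses to [ = false ] and the test
  # builtin reports "=: unary operator expected", exactly as in the log.
  if [ "$flag" = false ]; then         # quoting keeps the empty operand in place
      echo "flag is false"
  fi
  if [[ $flag = false ]]; then         # [[ ]] does not word-split, so it is also safe
      echo "flag is false"
  fi

In the trace the script carries on past the error, so the broken comparison mis-evaluates rather than aborting the run.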
00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.244 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.502 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:00.502 "name": "raid_bdev1", 00:23:00.502 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:00.502 "strip_size_kb": 0, 00:23:00.502 "state": "online", 00:23:00.502 "raid_level": "raid1", 00:23:00.502 "superblock": true, 00:23:00.502 "num_base_bdevs": 2, 00:23:00.502 "num_base_bdevs_discovered": 2, 00:23:00.502 "num_base_bdevs_operational": 2, 00:23:00.502 "process": { 00:23:00.502 "type": "rebuild", 00:23:00.502 "target": "spare", 00:23:00.502 "progress": { 00:23:00.503 "blocks": 6656, 00:23:00.503 "percent": 83 00:23:00.503 } 00:23:00.503 }, 00:23:00.503 "base_bdevs_list": [ 00:23:00.503 { 00:23:00.503 "name": "spare", 00:23:00.503 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:00.503 "is_configured": true, 00:23:00.503 "data_offset": 256, 00:23:00.503 "data_size": 7936 00:23:00.503 }, 00:23:00.503 { 00:23:00.503 "name": "BaseBdev2", 00:23:00.503 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:00.503 "is_configured": true, 00:23:00.503 "data_offset": 256, 00:23:00.503 "data_size": 7936 00:23:00.503 } 00:23:00.503 ] 00:23:00.503 }' 00:23:00.503 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:00.503 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:00.503 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:00.503 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:00.503 18:59:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:00.761 [2024-07-24 18:59:45.736804] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:00.761 [2024-07-24 18:59:45.736848] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:00.761 [2024-07-24 18:59:45.736906] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.695 
18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.695 "name": "raid_bdev1", 00:23:01.695 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:01.695 "strip_size_kb": 0, 00:23:01.695 "state": "online", 00:23:01.695 "raid_level": "raid1", 00:23:01.695 "superblock": true, 00:23:01.695 "num_base_bdevs": 2, 00:23:01.695 "num_base_bdevs_discovered": 2, 00:23:01.695 "num_base_bdevs_operational": 2, 00:23:01.695 "base_bdevs_list": [ 00:23:01.695 { 00:23:01.695 "name": "spare", 00:23:01.695 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:01.695 "is_configured": true, 00:23:01.695 "data_offset": 256, 00:23:01.695 "data_size": 7936 00:23:01.695 }, 00:23:01.695 { 00:23:01.695 "name": "BaseBdev2", 00:23:01.695 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:01.695 "is_configured": true, 00:23:01.695 "data_offset": 256, 00:23:01.695 "data_size": 7936 00:23:01.695 } 00:23:01.695 ] 00:23:01.695 }' 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.695 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.954 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.954 "name": "raid_bdev1", 00:23:01.954 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:01.954 "strip_size_kb": 0, 00:23:01.954 "state": "online", 00:23:01.954 "raid_level": "raid1", 00:23:01.954 "superblock": true, 00:23:01.954 "num_base_bdevs": 2, 00:23:01.954 "num_base_bdevs_discovered": 2, 00:23:01.954 "num_base_bdevs_operational": 2, 00:23:01.954 "base_bdevs_list": [ 00:23:01.954 { 00:23:01.954 "name": "spare", 00:23:01.954 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:01.954 
"is_configured": true, 00:23:01.954 "data_offset": 256, 00:23:01.954 "data_size": 7936 00:23:01.954 }, 00:23:01.954 { 00:23:01.954 "name": "BaseBdev2", 00:23:01.954 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:01.954 "is_configured": true, 00:23:01.954 "data_offset": 256, 00:23:01.954 "data_size": 7936 00:23:01.954 } 00:23:01.954 ] 00:23:01.954 }' 00:23:01.954 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.954 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:01.954 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.955 18:59:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.214 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.214 "name": "raid_bdev1", 00:23:02.214 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:02.214 "strip_size_kb": 0, 00:23:02.214 "state": "online", 00:23:02.214 "raid_level": "raid1", 00:23:02.214 "superblock": true, 00:23:02.214 "num_base_bdevs": 2, 00:23:02.214 "num_base_bdevs_discovered": 2, 00:23:02.214 "num_base_bdevs_operational": 2, 00:23:02.214 "base_bdevs_list": [ 00:23:02.214 { 00:23:02.214 "name": "spare", 00:23:02.214 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:02.214 "is_configured": true, 00:23:02.214 "data_offset": 256, 00:23:02.214 "data_size": 7936 00:23:02.214 }, 00:23:02.214 { 00:23:02.214 "name": "BaseBdev2", 00:23:02.214 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:02.214 "is_configured": true, 00:23:02.214 "data_offset": 256, 00:23:02.214 "data_size": 7936 00:23:02.214 } 00:23:02.214 ] 00:23:02.214 }' 00:23:02.214 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:23:02.214 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:02.803 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:02.803 [2024-07-24 18:59:47.730161] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:02.803 [2024-07-24 18:59:47.730182] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:02.803 [2024-07-24 18:59:47.730226] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:02.803 [2024-07-24 18:59:47.730262] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:02.803 [2024-07-24 18:59:47.730267] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207cd80 name raid_bdev1, state offline 00:23:02.803 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.803 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:23:03.123 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:03.123 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:23:03.123 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:03.123 18:59:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:03.123 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:03.382 [2024-07-24 18:59:48.235490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:03.382 [2024-07-24 18:59:48.235520] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.382 [2024-07-24 18:59:48.235531] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20863f0 00:23:03.382 [2024-07-24 18:59:48.235537] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.382 [2024-07-24 18:59:48.236764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.382 [2024-07-24 18:59:48.236784] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:03.382 [2024-07-24 18:59:48.236821] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:03.382 [2024-07-24 18:59:48.236839] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.382 [2024-07-24 18:59:48.236895] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:03.382 spare 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.382 18:59:48 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.382 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.382 [2024-07-24 18:59:48.337182] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x207da70 00:23:03.382 [2024-07-24 18:59:48.337192] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:03.382 [2024-07-24 18:59:48.337250] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207dd50 00:23:03.382 [2024-07-24 18:59:48.337313] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x207da70 00:23:03.382 [2024-07-24 18:59:48.337319] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x207da70 00:23:03.382 [2024-07-24 18:59:48.337363] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:03.640 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:03.640 "name": "raid_bdev1", 00:23:03.640 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:03.641 "strip_size_kb": 0, 00:23:03.641 "state": "online", 00:23:03.641 "raid_level": "raid1", 00:23:03.641 "superblock": true, 00:23:03.641 "num_base_bdevs": 2, 00:23:03.641 "num_base_bdevs_discovered": 2, 00:23:03.641 "num_base_bdevs_operational": 2, 00:23:03.641 "base_bdevs_list": [ 00:23:03.641 { 00:23:03.641 "name": "spare", 00:23:03.641 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:03.641 "is_configured": true, 00:23:03.641 "data_offset": 256, 00:23:03.641 "data_size": 7936 00:23:03.641 }, 00:23:03.641 { 00:23:03.641 "name": "BaseBdev2", 00:23:03.641 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:03.641 "is_configured": true, 00:23:03.641 "data_offset": 256, 00:23:03.641 "data_size": 7936 00:23:03.641 } 00:23:03.641 ] 00:23:03.641 }' 00:23:03.641 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.641 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
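The verify_raid_bdev_process checks traced around here read the optional "process" object from the bdev_raid_get_bdevs output: while a rebuild is in flight it carries type, target and progress, and once the rebuild completes the object disappears, which is why the jq filters fall back to "none". An illustrative sketch of that check, reusing the jq expressions from the trace (again assuming the target from this run is still up):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  raid_bdev_info=$("$rpc" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
      jq -r '.[] | select(.name == "raid_bdev1")')
  ptype=$(jq -r '.process.type // "none"'     <<< "$raid_bdev_info")
  ptarget=$(jq -r '.process.target // "none"' <<< "$raid_bdev_info")
  # e.g. rebuild/spare while the rebuild is running, none/none after it finishes
  echo "process=$ptype target=$ptarget"
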
00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.899 18:59:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.158 "name": "raid_bdev1", 00:23:04.158 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:04.158 "strip_size_kb": 0, 00:23:04.158 "state": "online", 00:23:04.158 "raid_level": "raid1", 00:23:04.158 "superblock": true, 00:23:04.158 "num_base_bdevs": 2, 00:23:04.158 "num_base_bdevs_discovered": 2, 00:23:04.158 "num_base_bdevs_operational": 2, 00:23:04.158 "base_bdevs_list": [ 00:23:04.158 { 00:23:04.158 "name": "spare", 00:23:04.158 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:04.158 "is_configured": true, 00:23:04.158 "data_offset": 256, 00:23:04.158 "data_size": 7936 00:23:04.158 }, 00:23:04.158 { 00:23:04.158 "name": "BaseBdev2", 00:23:04.158 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:04.158 "is_configured": true, 00:23:04.158 "data_offset": 256, 00:23:04.158 "data_size": 7936 00:23:04.158 } 00:23:04.158 ] 00:23:04.158 }' 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.158 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:04.416 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:04.416 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:04.675 [2024-07-24 18:59:49.466726] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.675 18:59:49 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.675 "name": "raid_bdev1", 00:23:04.675 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:04.675 "strip_size_kb": 0, 00:23:04.675 "state": "online", 00:23:04.675 "raid_level": "raid1", 00:23:04.675 "superblock": true, 00:23:04.675 "num_base_bdevs": 2, 00:23:04.675 "num_base_bdevs_discovered": 1, 00:23:04.675 "num_base_bdevs_operational": 1, 00:23:04.675 "base_bdevs_list": [ 00:23:04.675 { 00:23:04.675 "name": null, 00:23:04.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.675 "is_configured": false, 00:23:04.675 "data_offset": 256, 00:23:04.675 "data_size": 7936 00:23:04.675 }, 00:23:04.675 { 00:23:04.675 "name": "BaseBdev2", 00:23:04.675 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:04.675 "is_configured": true, 00:23:04.675 "data_offset": 256, 00:23:04.675 "data_size": 7936 00:23:04.675 } 00:23:04.675 ] 00:23:04.675 }' 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.675 18:59:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:05.241 18:59:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:05.500 [2024-07-24 18:59:50.280850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:05.500 [2024-07-24 18:59:50.280976] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:05.500 [2024-07-24 18:59:50.280986] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:05.500 [2024-07-24 18:59:50.281005] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:05.500 [2024-07-24 18:59:50.284052] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207dd50 00:23:05.500 [2024-07-24 18:59:50.285462] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:05.500 18:59:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.433 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.692 "name": "raid_bdev1", 00:23:06.692 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:06.692 "strip_size_kb": 0, 00:23:06.692 "state": "online", 00:23:06.692 "raid_level": "raid1", 00:23:06.692 "superblock": true, 00:23:06.692 "num_base_bdevs": 2, 00:23:06.692 "num_base_bdevs_discovered": 2, 00:23:06.692 "num_base_bdevs_operational": 2, 00:23:06.692 "process": { 00:23:06.692 "type": "rebuild", 00:23:06.692 "target": "spare", 00:23:06.692 "progress": { 00:23:06.692 "blocks": 2816, 00:23:06.692 "percent": 35 00:23:06.692 } 00:23:06.692 }, 00:23:06.692 "base_bdevs_list": [ 00:23:06.692 { 00:23:06.692 "name": "spare", 00:23:06.692 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:06.692 "is_configured": true, 00:23:06.692 "data_offset": 256, 00:23:06.692 "data_size": 7936 00:23:06.692 }, 00:23:06.692 { 00:23:06.692 "name": "BaseBdev2", 00:23:06.692 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:06.692 "is_configured": true, 00:23:06.692 "data_offset": 256, 00:23:06.692 "data_size": 7936 00:23:06.692 } 00:23:06.692 ] 00:23:06.692 }' 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:06.692 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:06.951 [2024-07-24 18:59:51.713878] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:06.951 [2024-07-24 18:59:51.796115] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:06.952 [2024-07-24 18:59:51.796142] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.952 [2024-07-24 18:59:51.796150] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:06.952 [2024-07-24 18:59:51.796154] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.952 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:07.210 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:07.210 "name": "raid_bdev1", 00:23:07.210 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:07.210 "strip_size_kb": 0, 00:23:07.210 "state": "online", 00:23:07.210 "raid_level": "raid1", 00:23:07.210 "superblock": true, 00:23:07.210 "num_base_bdevs": 2, 00:23:07.210 "num_base_bdevs_discovered": 1, 00:23:07.210 "num_base_bdevs_operational": 1, 00:23:07.210 "base_bdevs_list": [ 00:23:07.210 { 00:23:07.210 "name": null, 00:23:07.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.211 "is_configured": false, 00:23:07.211 "data_offset": 256, 00:23:07.211 "data_size": 7936 00:23:07.211 }, 00:23:07.211 { 00:23:07.211 "name": "BaseBdev2", 00:23:07.211 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:07.211 "is_configured": true, 00:23:07.211 "data_offset": 256, 00:23:07.211 "data_size": 7936 00:23:07.211 } 00:23:07.211 ] 00:23:07.211 }' 00:23:07.211 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:07.211 18:59:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:07.469 18:59:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:07.727 [2024-07-24 
18:59:52.629663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:07.727 [2024-07-24 18:59:52.629709] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.727 [2024-07-24 18:59:52.629740] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20867b0 00:23:07.727 [2024-07-24 18:59:52.629746] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.727 [2024-07-24 18:59:52.629883] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.727 [2024-07-24 18:59:52.629892] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:07.727 [2024-07-24 18:59:52.629929] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:07.727 [2024-07-24 18:59:52.629936] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:07.727 [2024-07-24 18:59:52.629941] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:07.727 [2024-07-24 18:59:52.629951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:07.727 [2024-07-24 18:59:52.632972] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ef46a0 00:23:07.727 [2024-07-24 18:59:52.634013] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:07.727 spare 00:23:07.727 18:59:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.661 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:08.919 "name": "raid_bdev1", 00:23:08.919 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:08.919 "strip_size_kb": 0, 00:23:08.919 "state": "online", 00:23:08.919 "raid_level": "raid1", 00:23:08.919 "superblock": true, 00:23:08.919 "num_base_bdevs": 2, 00:23:08.919 "num_base_bdevs_discovered": 2, 00:23:08.919 "num_base_bdevs_operational": 2, 00:23:08.919 "process": { 00:23:08.919 "type": "rebuild", 00:23:08.919 "target": "spare", 00:23:08.919 "progress": { 00:23:08.919 "blocks": 2816, 00:23:08.919 "percent": 35 00:23:08.919 } 00:23:08.919 }, 00:23:08.919 "base_bdevs_list": [ 00:23:08.919 { 00:23:08.919 "name": "spare", 00:23:08.919 "uuid": "652092a8-7ddb-589c-845c-208832851159", 00:23:08.919 "is_configured": true, 00:23:08.919 "data_offset": 256, 00:23:08.919 
"data_size": 7936 00:23:08.919 }, 00:23:08.919 { 00:23:08.919 "name": "BaseBdev2", 00:23:08.919 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:08.919 "is_configured": true, 00:23:08.919 "data_offset": 256, 00:23:08.919 "data_size": 7936 00:23:08.919 } 00:23:08.919 ] 00:23:08.919 }' 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:08.919 18:59:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:09.178 [2024-07-24 18:59:54.050293] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:09.178 [2024-07-24 18:59:54.144647] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:09.178 [2024-07-24 18:59:54.144676] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.178 [2024-07-24 18:59:54.144685] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:09.178 [2024-07-24 18:59:54.144689] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.178 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.436 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.436 "name": "raid_bdev1", 00:23:09.436 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:09.436 "strip_size_kb": 0, 00:23:09.436 "state": "online", 00:23:09.436 
"raid_level": "raid1", 00:23:09.436 "superblock": true, 00:23:09.436 "num_base_bdevs": 2, 00:23:09.436 "num_base_bdevs_discovered": 1, 00:23:09.436 "num_base_bdevs_operational": 1, 00:23:09.436 "base_bdevs_list": [ 00:23:09.436 { 00:23:09.436 "name": null, 00:23:09.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.436 "is_configured": false, 00:23:09.436 "data_offset": 256, 00:23:09.436 "data_size": 7936 00:23:09.436 }, 00:23:09.436 { 00:23:09.436 "name": "BaseBdev2", 00:23:09.436 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:09.436 "is_configured": true, 00:23:09.436 "data_offset": 256, 00:23:09.436 "data_size": 7936 00:23:09.436 } 00:23:09.436 ] 00:23:09.436 }' 00:23:09.436 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.436 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:10.003 "name": "raid_bdev1", 00:23:10.003 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:10.003 "strip_size_kb": 0, 00:23:10.003 "state": "online", 00:23:10.003 "raid_level": "raid1", 00:23:10.003 "superblock": true, 00:23:10.003 "num_base_bdevs": 2, 00:23:10.003 "num_base_bdevs_discovered": 1, 00:23:10.003 "num_base_bdevs_operational": 1, 00:23:10.003 "base_bdevs_list": [ 00:23:10.003 { 00:23:10.003 "name": null, 00:23:10.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.003 "is_configured": false, 00:23:10.003 "data_offset": 256, 00:23:10.003 "data_size": 7936 00:23:10.003 }, 00:23:10.003 { 00:23:10.003 "name": "BaseBdev2", 00:23:10.003 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:10.003 "is_configured": true, 00:23:10.003 "data_offset": 256, 00:23:10.003 "data_size": 7936 00:23:10.003 } 00:23:10.003 ] 00:23:10.003 }' 00:23:10.003 18:59:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:10.003 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:10.003 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:10.262 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:10.262 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:10.262 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:10.521 [2024-07-24 18:59:55.367239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:10.521 [2024-07-24 18:59:55.367274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:10.521 [2024-07-24 18:59:55.367285] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efacc0 00:23:10.521 [2024-07-24 18:59:55.367306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:10.521 [2024-07-24 18:59:55.367429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:10.521 [2024-07-24 18:59:55.367443] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:10.521 [2024-07-24 18:59:55.367482] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:10.521 [2024-07-24 18:59:55.367489] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:10.521 [2024-07-24 18:59:55.367493] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:10.521 BaseBdev1 00:23:10.521 18:59:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.456 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.715 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.715 "name": "raid_bdev1", 00:23:11.715 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:11.715 "strip_size_kb": 0, 00:23:11.715 "state": "online", 00:23:11.715 "raid_level": "raid1", 00:23:11.715 
"superblock": true, 00:23:11.715 "num_base_bdevs": 2, 00:23:11.715 "num_base_bdevs_discovered": 1, 00:23:11.715 "num_base_bdevs_operational": 1, 00:23:11.715 "base_bdevs_list": [ 00:23:11.715 { 00:23:11.715 "name": null, 00:23:11.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.715 "is_configured": false, 00:23:11.715 "data_offset": 256, 00:23:11.715 "data_size": 7936 00:23:11.715 }, 00:23:11.715 { 00:23:11.715 "name": "BaseBdev2", 00:23:11.715 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:11.715 "is_configured": true, 00:23:11.715 "data_offset": 256, 00:23:11.715 "data_size": 7936 00:23:11.715 } 00:23:11.715 ] 00:23:11.715 }' 00:23:11.715 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.715 18:59:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.282 "name": "raid_bdev1", 00:23:12.282 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:12.282 "strip_size_kb": 0, 00:23:12.282 "state": "online", 00:23:12.282 "raid_level": "raid1", 00:23:12.282 "superblock": true, 00:23:12.282 "num_base_bdevs": 2, 00:23:12.282 "num_base_bdevs_discovered": 1, 00:23:12.282 "num_base_bdevs_operational": 1, 00:23:12.282 "base_bdevs_list": [ 00:23:12.282 { 00:23:12.282 "name": null, 00:23:12.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.282 "is_configured": false, 00:23:12.282 "data_offset": 256, 00:23:12.282 "data_size": 7936 00:23:12.282 }, 00:23:12.282 { 00:23:12.282 "name": "BaseBdev2", 00:23:12.282 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:12.282 "is_configured": true, 00:23:12.282 "data_offset": 256, 00:23:12.282 "data_size": 7936 00:23:12.282 } 00:23:12.282 ] 00:23:12.282 }' 00:23:12.282 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:12.540 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:12.541 [2024-07-24 18:59:57.496834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:12.541 [2024-07-24 18:59:57.496928] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:12.541 [2024-07-24 18:59:57.496937] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:12.541 request: 00:23:12.541 { 00:23:12.541 "base_bdev": "BaseBdev1", 00:23:12.541 "raid_bdev": "raid_bdev1", 00:23:12.541 "method": "bdev_raid_add_base_bdev", 00:23:12.541 "req_id": 1 00:23:12.541 } 00:23:12.541 Got JSON-RPC error response 00:23:12.541 response: 00:23:12.541 { 00:23:12.541 "code": -22, 00:23:12.541 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:12.541 } 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:12.541 18:59:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:13.913 18:59:58 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.913 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.913 "name": "raid_bdev1", 00:23:13.913 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:13.913 "strip_size_kb": 0, 00:23:13.913 "state": "online", 00:23:13.913 "raid_level": "raid1", 00:23:13.913 "superblock": true, 00:23:13.913 "num_base_bdevs": 2, 00:23:13.913 "num_base_bdevs_discovered": 1, 00:23:13.913 "num_base_bdevs_operational": 1, 00:23:13.913 "base_bdevs_list": [ 00:23:13.914 { 00:23:13.914 "name": null, 00:23:13.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.914 "is_configured": false, 00:23:13.914 "data_offset": 256, 00:23:13.914 "data_size": 7936 00:23:13.914 }, 00:23:13.914 { 00:23:13.914 "name": "BaseBdev2", 00:23:13.914 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:13.914 "is_configured": true, 00:23:13.914 "data_offset": 256, 00:23:13.914 "data_size": 7936 00:23:13.914 } 00:23:13.914 ] 00:23:13.914 }' 00:23:13.914 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.914 18:59:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.171 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.429 "name": "raid_bdev1", 00:23:14.429 "uuid": "24504337-89eb-4474-b492-9bce949b6a4a", 00:23:14.429 "strip_size_kb": 0, 00:23:14.429 "state": "online", 00:23:14.429 "raid_level": "raid1", 00:23:14.429 "superblock": true, 00:23:14.429 "num_base_bdevs": 2, 00:23:14.429 "num_base_bdevs_discovered": 1, 00:23:14.429 "num_base_bdevs_operational": 1, 00:23:14.429 "base_bdevs_list": [ 00:23:14.429 { 00:23:14.429 "name": null, 00:23:14.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.429 "is_configured": false, 00:23:14.429 "data_offset": 256, 00:23:14.429 "data_size": 7936 00:23:14.429 }, 00:23:14.429 { 00:23:14.429 "name": "BaseBdev2", 00:23:14.429 "uuid": "e1e66c25-2db5-5d04-ae8d-59a9d45f19f6", 00:23:14.429 "is_configured": true, 00:23:14.429 "data_offset": 256, 00:23:14.429 "data_size": 7936 00:23:14.429 } 00:23:14.429 ] 00:23:14.429 }' 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.429 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2202658 00:23:14.430 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2202658 ']' 00:23:14.430 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2202658 00:23:14.430 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:23:14.430 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2202658 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2202658' 00:23:14.688 killing process with pid 2202658 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2202658 00:23:14.688 Received shutdown signal, test time was about 60.000000 seconds 00:23:14.688 00:23:14.688 Latency(us) 00:23:14.688 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:14.688 =================================================================================================================== 00:23:14.688 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:14.688 [2024-07-24 18:59:59.477037] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:14.688 [2024-07-24 18:59:59.477102] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:14.688 
[2024-07-24 18:59:59.477136] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:14.688 [2024-07-24 18:59:59.477141] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207da70 name raid_bdev1, state offline 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2202658 00:23:14.688 [2024-07-24 18:59:59.501047] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:23:14.688 00:23:14.688 real 0m23.847s 00:23:14.688 user 0m37.051s 00:23:14.688 sys 0m2.529s 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:14.688 18:59:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:14.688 ************************************ 00:23:14.688 END TEST raid_rebuild_test_sb_md_interleaved 00:23:14.688 ************************************ 00:23:14.947 18:59:59 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:23:14.947 18:59:59 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:23:14.947 18:59:59 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2202658 ']' 00:23:14.947 18:59:59 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2202658 00:23:14.947 18:59:59 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:23:14.947 00:23:14.947 real 14m10.549s 00:23:14.947 user 24m0.717s 00:23:14.947 sys 2m8.733s 00:23:14.947 18:59:59 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:14.947 18:59:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:14.947 ************************************ 00:23:14.947 END TEST bdev_raid 00:23:14.947 ************************************ 00:23:14.947 18:59:59 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:14.947 18:59:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:23:14.947 18:59:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:14.947 18:59:59 -- common/autotest_common.sh@10 -- # set +x 00:23:14.947 ************************************ 00:23:14.947 START TEST bdevperf_config 00:23:14.947 ************************************ 00:23:14.947 18:59:59 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:14.947 * Looking for test storage... 
00:23:14.947 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:14.947 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:14.947 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:14.947 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:23:14.947 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:14.947 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:14.947 18:59:59 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-24 18:59:59.975439] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:17.477 [2024-07-24 18:59:59.975488] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206994 ] 00:23:17.477 Using job config with 4 jobs 00:23:17.477 [2024-07-24 19:00:00.048987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:17.477 [2024-07-24 19:00:00.136127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.477 cpumask for '\''job0'\'' is too big 00:23:17.477 cpumask for '\''job1'\'' is too big 00:23:17.477 cpumask for '\''job2'\'' is too big 00:23:17.477 cpumask for '\''job3'\'' is too big 00:23:17.477 Running I/O for 2 seconds... 00:23:17.477 00:23:17.477 Latency(us) 00:23:17.477 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35612.63 34.78 0.00 0.00 7181.74 1240.50 10610.59 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35587.47 34.75 0.00 0.00 7176.02 1170.29 9487.12 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35563.02 34.73 0.00 0.00 7169.88 1193.69 9050.21 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35538.56 34.71 0.00 0.00 7165.12 1240.50 9050.21 00:23:17.477 =================================================================================================================== 00:23:17.477 Total : 142301.68 138.97 0.00 0.00 7173.19 1170.29 10610.59' 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-24 18:59:59.975439] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:23:17.477 [2024-07-24 18:59:59.975488] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206994 ] 00:23:17.477 Using job config with 4 jobs 00:23:17.477 [2024-07-24 19:00:00.048987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:17.477 [2024-07-24 19:00:00.136127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.477 cpumask for '\''job0'\'' is too big 00:23:17.477 cpumask for '\''job1'\'' is too big 00:23:17.477 cpumask for '\''job2'\'' is too big 00:23:17.477 cpumask for '\''job3'\'' is too big 00:23:17.477 Running I/O for 2 seconds... 00:23:17.477 00:23:17.477 Latency(us) 00:23:17.477 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35612.63 34.78 0.00 0.00 7181.74 1240.50 10610.59 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35587.47 34.75 0.00 0.00 7176.02 1170.29 9487.12 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35563.02 34.73 0.00 0.00 7169.88 1193.69 9050.21 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35538.56 34.71 0.00 0.00 7165.12 1240.50 9050.21 00:23:17.477 =================================================================================================================== 00:23:17.477 Total : 142301.68 138.97 0.00 0.00 7173.19 1170.29 10610.59' 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 18:59:59.975439] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:17.477 [2024-07-24 18:59:59.975488] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206994 ] 00:23:17.477 Using job config with 4 jobs 00:23:17.477 [2024-07-24 19:00:00.048987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:17.477 [2024-07-24 19:00:00.136127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.477 cpumask for '\''job0'\'' is too big 00:23:17.477 cpumask for '\''job1'\'' is too big 00:23:17.477 cpumask for '\''job2'\'' is too big 00:23:17.477 cpumask for '\''job3'\'' is too big 00:23:17.477 Running I/O for 2 seconds... 
00:23:17.477 00:23:17.477 Latency(us) 00:23:17.477 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35612.63 34.78 0.00 0.00 7181.74 1240.50 10610.59 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.01 35587.47 34.75 0.00 0.00 7176.02 1170.29 9487.12 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35563.02 34.73 0.00 0.00 7169.88 1193.69 9050.21 00:23:17.477 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:17.477 Malloc0 : 2.02 35538.56 34.71 0.00 0.00 7165.12 1240.50 9050.21 00:23:17.477 =================================================================================================================== 00:23:17.477 Total : 142301.68 138.97 0.00 0.00 7173.19 1170.29 10610.59' 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:23:17.477 19:00:02 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:17.734 [2024-07-24 19:00:02.530464] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:17.734 [2024-07-24 19:00:02.530520] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207574 ] 00:23:17.734 [2024-07-24 19:00:02.603786] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:17.734 [2024-07-24 19:00:02.687041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.991 cpumask for 'job0' is too big 00:23:17.991 cpumask for 'job1' is too big 00:23:17.991 cpumask for 'job2' is too big 00:23:17.991 cpumask for 'job3' is too big 00:23:20.520 19:00:05 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:23:20.520 Running I/O for 2 seconds... 
00:23:20.520 00:23:20.520 Latency(us) 00:23:20.520 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:20.520 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:20.520 Malloc0 : 2.01 37479.12 36.60 0.00 0.00 6824.03 1185.89 10298.51 00:23:20.520 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:20.520 Malloc0 : 2.01 37480.68 36.60 0.00 0.00 6814.12 1178.09 9112.62 00:23:20.520 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:20.520 Malloc0 : 2.02 37457.49 36.58 0.00 0.00 6808.37 1162.48 7926.74 00:23:20.520 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:20.520 Malloc0 : 2.02 37434.28 36.56 0.00 0.00 6802.98 1162.48 7708.28 00:23:20.520 =================================================================================================================== 00:23:20.520 Total : 149851.58 146.34 0.00 0.00 6812.37 1162.48 10298.51' 00:23:20.520 19:00:05 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:23:20.520 19:00:05 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:20.520 19:00:05 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:23:20.520 19:00:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:20.521 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:20.521 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:20.521 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:20.521 19:00:05 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:23.050 19:00:07 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-24 19:00:05.102087] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:23.050 [2024-07-24 19:00:05.102132] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207944 ] 00:23:23.050 Using job config with 3 jobs 00:23:23.050 [2024-07-24 19:00:05.179727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:23.050 [2024-07-24 19:00:05.276246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:23.050 cpumask for '\''job0'\'' is too big 00:23:23.050 cpumask for '\''job1'\'' is too big 00:23:23.050 cpumask for '\''job2'\'' is too big 00:23:23.050 Running I/O for 2 seconds... 00:23:23.050 00:23:23.050 Latency(us) 00:23:23.050 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:23.050 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.050 Malloc0 : 2.01 49650.52 48.49 0.00 0.00 5153.36 1240.50 7365.00 00:23:23.050 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.050 Malloc0 : 2.01 49666.53 48.50 0.00 0.00 5144.29 1154.68 6803.26 00:23:23.050 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.050 Malloc0 : 2.01 49638.25 48.47 0.00 0.00 5139.61 1146.88 6834.47 00:23:23.050 =================================================================================================================== 00:23:23.051 Total : 148955.30 145.46 0.00 0.00 5145.75 1146.88 7365.00' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-24 19:00:05.102087] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:23.051 [2024-07-24 19:00:05.102132] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207944 ] 00:23:23.051 Using job config with 3 jobs 00:23:23.051 [2024-07-24 19:00:05.179727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:23.051 [2024-07-24 19:00:05.276246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:23.051 cpumask for '\''job0'\'' is too big 00:23:23.051 cpumask for '\''job1'\'' is too big 00:23:23.051 cpumask for '\''job2'\'' is too big 00:23:23.051 Running I/O for 2 seconds... 
00:23:23.051 00:23:23.051 Latency(us) 00:23:23.051 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49650.52 48.49 0.00 0.00 5153.36 1240.50 7365.00 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49666.53 48.50 0.00 0.00 5144.29 1154.68 6803.26 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49638.25 48.47 0.00 0.00 5139.61 1146.88 6834.47 00:23:23.051 =================================================================================================================== 00:23:23.051 Total : 148955.30 145.46 0.00 0.00 5145.75 1146.88 7365.00' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 19:00:05.102087] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:23.051 [2024-07-24 19:00:05.102132] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207944 ] 00:23:23.051 Using job config with 3 jobs 00:23:23.051 [2024-07-24 19:00:05.179727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:23.051 [2024-07-24 19:00:05.276246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:23.051 cpumask for '\''job0'\'' is too big 00:23:23.051 cpumask for '\''job1'\'' is too big 00:23:23.051 cpumask for '\''job2'\'' is too big 00:23:23.051 Running I/O for 2 seconds... 00:23:23.051 00:23:23.051 Latency(us) 00:23:23.051 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49650.52 48.49 0.00 0.00 5153.36 1240.50 7365.00 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49666.53 48.50 0.00 0.00 5144.29 1154.68 6803.26 00:23:23.051 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:23.051 Malloc0 : 2.01 49638.25 48.47 0.00 0.00 5139.61 1146.88 6834.47 00:23:23.051 =================================================================================================================== 00:23:23.051 Total : 148955.30 145.46 0.00 0.00 5145.75 1146.88 7365.00' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
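The chained greps visible at bdevperf/common.sh@32 in the trace above are how the test recovers the job count that bdevperf prints ("Using job config with N jobs") and compares it against the expected value. A condensed, standalone sketch of that get_num_jobs helper, reconstructed from the trace rather than copied from the repository (the sample output string is illustrative, not real bdevperf output):

get_num_jobs() {
    # Isolate the "Using job config with N jobs" notice from the captured
    # bdevperf output, then strip everything except the number itself.
    echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
}

bdevperf_output='... Using job config with 4 jobs ...'
[[ $(get_num_jobs "$bdevperf_output") == "4" ]] && echo "job count matches"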
00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:23.051 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:23.051 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:23.051 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:23.051 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:23.051 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:23.051 19:00:07 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-24 19:00:07.702041] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:23:25.597 [2024-07-24 19:00:07.702087] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208319 ] 00:23:25.597 Using job config with 4 jobs 00:23:25.597 [2024-07-24 19:00:07.784335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.597 [2024-07-24 19:00:07.884517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.597 cpumask for '\''job0'\'' is too big 00:23:25.597 cpumask for '\''job1'\'' is too big 00:23:25.597 cpumask for '\''job2'\'' is too big 00:23:25.597 cpumask for '\''job3'\'' is too big 00:23:25.597 Running I/O for 2 seconds... 00:23:25.597 00:23:25.597 Latency(us) 00:23:25.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17934.22 17.51 0.00 0.00 14263.91 2512.21 21845.33 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17922.73 17.50 0.00 0.00 14264.47 3011.54 21845.33 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17911.62 17.49 0.00 0.00 14240.27 2512.21 19348.72 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17900.18 17.48 0.00 0.00 14239.38 3011.54 19348.72 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17889.10 17.47 0.00 0.00 14214.80 2512.21 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17877.73 17.46 0.00 0.00 14215.40 3089.55 17850.76 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17866.60 17.45 0.00 0.00 14189.61 2559.02 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.04 17855.33 17.44 0.00 0.00 14188.96 3089.55 17850.76 00:23:25.597 =================================================================================================================== 00:23:25.597 Total : 143157.52 139.80 0.00 0.00 14227.10 2512.21 21845.33' 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-24 19:00:07.702041] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:25.597 [2024-07-24 19:00:07.702087] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208319 ] 00:23:25.597 Using job config with 4 jobs 00:23:25.597 [2024-07-24 19:00:07.784335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.597 [2024-07-24 19:00:07.884517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.597 cpumask for '\''job0'\'' is too big 00:23:25.597 cpumask for '\''job1'\'' is too big 00:23:25.597 cpumask for '\''job2'\'' is too big 00:23:25.597 cpumask for '\''job3'\'' is too big 00:23:25.597 Running I/O for 2 seconds... 
00:23:25.597 00:23:25.597 Latency(us) 00:23:25.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17934.22 17.51 0.00 0.00 14263.91 2512.21 21845.33 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17922.73 17.50 0.00 0.00 14264.47 3011.54 21845.33 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17911.62 17.49 0.00 0.00 14240.27 2512.21 19348.72 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17900.18 17.48 0.00 0.00 14239.38 3011.54 19348.72 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17889.10 17.47 0.00 0.00 14214.80 2512.21 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17877.73 17.46 0.00 0.00 14215.40 3089.55 17850.76 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17866.60 17.45 0.00 0.00 14189.61 2559.02 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.04 17855.33 17.44 0.00 0.00 14188.96 3089.55 17850.76 00:23:25.597 =================================================================================================================== 00:23:25.597 Total : 143157.52 139.80 0.00 0.00 14227.10 2512.21 21845.33' 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 19:00:07.702041] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:25.597 [2024-07-24 19:00:07.702087] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208319 ] 00:23:25.597 Using job config with 4 jobs 00:23:25.597 [2024-07-24 19:00:07.784335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.597 [2024-07-24 19:00:07.884517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.597 cpumask for '\''job0'\'' is too big 00:23:25.597 cpumask for '\''job1'\'' is too big 00:23:25.597 cpumask for '\''job2'\'' is too big 00:23:25.597 cpumask for '\''job3'\'' is too big 00:23:25.597 Running I/O for 2 seconds... 
00:23:25.597 00:23:25.597 Latency(us) 00:23:25.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17934.22 17.51 0.00 0.00 14263.91 2512.21 21845.33 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17922.73 17.50 0.00 0.00 14264.47 3011.54 21845.33 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17911.62 17.49 0.00 0.00 14240.27 2512.21 19348.72 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17900.18 17.48 0.00 0.00 14239.38 3011.54 19348.72 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17889.10 17.47 0.00 0.00 14214.80 2512.21 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.03 17877.73 17.46 0.00 0.00 14215.40 3089.55 17850.76 00:23:25.597 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc0 : 2.03 17866.60 17.45 0.00 0.00 14189.61 2559.02 17850.76 00:23:25.597 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:25.597 Malloc1 : 2.04 17855.33 17.44 0.00 0.00 14188.96 3089.55 17850.76 00:23:25.597 =================================================================================================================== 00:23:25.597 Total : 143157.52 139.80 0.00 0.00 14227.10 2512.21 21845.33' 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:25.597 19:00:10 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:23:25.597 00:23:25.597 real 0m10.452s 00:23:25.597 user 0m9.482s 00:23:25.597 sys 0m0.807s 00:23:25.597 19:00:10 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:25.597 19:00:10 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:23:25.597 ************************************ 00:23:25.597 END TEST bdevperf_config 00:23:25.598 ************************************ 00:23:25.598 19:00:10 -- spdk/autotest.sh@192 -- # uname -s 00:23:25.598 19:00:10 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:23:25.598 19:00:10 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:25.598 19:00:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:23:25.598 19:00:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:25.598 19:00:10 -- common/autotest_common.sh@10 -- # set +x 00:23:25.598 ************************************ 00:23:25.598 START TEST reactor_set_interrupt 00:23:25.598 ************************************ 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:25.598 * Looking for test storage... 00:23:25.598 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:25.598 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:25.598 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:25.598 19:00:10 reactor_set_interrupt -- 
common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@48 -- # 
CONFIG_RDMA=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:25.598 19:00:10 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:25.599 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@54 -- # 
source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:25.599 #define SPDK_CONFIG_H 00:23:25.599 #define SPDK_CONFIG_APPS 1 00:23:25.599 #define SPDK_CONFIG_ARCH native 00:23:25.599 #undef SPDK_CONFIG_ASAN 00:23:25.599 #undef SPDK_CONFIG_AVAHI 00:23:25.599 #undef SPDK_CONFIG_CET 00:23:25.599 #define SPDK_CONFIG_COVERAGE 1 00:23:25.599 #define SPDK_CONFIG_CROSS_PREFIX 00:23:25.599 #define SPDK_CONFIG_CRYPTO 1 00:23:25.599 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:25.599 #undef SPDK_CONFIG_CUSTOMOCF 00:23:25.599 #undef SPDK_CONFIG_DAOS 00:23:25.599 #define SPDK_CONFIG_DAOS_DIR 00:23:25.599 #define SPDK_CONFIG_DEBUG 1 00:23:25.599 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:25.599 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:25.599 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:25.599 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:25.599 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:25.599 #undef SPDK_CONFIG_DPDK_UADK 00:23:25.599 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:25.599 #define SPDK_CONFIG_EXAMPLES 1 00:23:25.599 #undef SPDK_CONFIG_FC 00:23:25.599 #define SPDK_CONFIG_FC_PATH 00:23:25.599 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:25.599 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:25.599 #undef SPDK_CONFIG_FUSE 00:23:25.599 #undef SPDK_CONFIG_FUZZER 00:23:25.599 #define SPDK_CONFIG_FUZZER_LIB 00:23:25.599 #undef SPDK_CONFIG_GOLANG 00:23:25.599 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:25.599 #define SPDK_CONFIG_HAVE_EVP_MAC 1 
00:23:25.599 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:25.599 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:25.599 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:25.599 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:25.599 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:25.599 #define SPDK_CONFIG_IDXD 1 00:23:25.599 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:25.599 #define SPDK_CONFIG_IPSEC_MB 1 00:23:25.599 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:25.599 #define SPDK_CONFIG_ISAL 1 00:23:25.599 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:25.599 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:25.599 #define SPDK_CONFIG_LIBDIR 00:23:25.599 #undef SPDK_CONFIG_LTO 00:23:25.599 #define SPDK_CONFIG_MAX_LCORES 128 00:23:25.599 #define SPDK_CONFIG_NVME_CUSE 1 00:23:25.599 #undef SPDK_CONFIG_OCF 00:23:25.599 #define SPDK_CONFIG_OCF_PATH 00:23:25.599 #define SPDK_CONFIG_OPENSSL_PATH 00:23:25.599 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:25.599 #define SPDK_CONFIG_PGO_DIR 00:23:25.599 #undef SPDK_CONFIG_PGO_USE 00:23:25.599 #define SPDK_CONFIG_PREFIX /usr/local 00:23:25.599 #undef SPDK_CONFIG_RAID5F 00:23:25.599 #undef SPDK_CONFIG_RBD 00:23:25.599 #define SPDK_CONFIG_RDMA 1 00:23:25.599 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:25.599 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:25.599 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:25.599 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:25.599 #define SPDK_CONFIG_SHARED 1 00:23:25.599 #undef SPDK_CONFIG_SMA 00:23:25.599 #define SPDK_CONFIG_TESTS 1 00:23:25.599 #undef SPDK_CONFIG_TSAN 00:23:25.599 #define SPDK_CONFIG_UBLK 1 00:23:25.599 #define SPDK_CONFIG_UBSAN 1 00:23:25.599 #undef SPDK_CONFIG_UNIT_TESTS 00:23:25.599 #undef SPDK_CONFIG_URING 00:23:25.599 #define SPDK_CONFIG_URING_PATH 00:23:25.599 #undef SPDK_CONFIG_URING_ZNS 00:23:25.599 #undef SPDK_CONFIG_USDT 00:23:25.599 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:25.599 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:25.599 #undef SPDK_CONFIG_VFIO_USER 00:23:25.599 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:25.599 #define SPDK_CONFIG_VHOST 1 00:23:25.599 #define SPDK_CONFIG_VIRTIO 1 00:23:25.599 #undef SPDK_CONFIG_VTUNE 00:23:25.599 #define SPDK_CONFIG_VTUNE_DIR 00:23:25.599 #define SPDK_CONFIG_WERROR 1 00:23:25.599 #define SPDK_CONFIG_WPDK_DIR 00:23:25.599 #undef SPDK_CONFIG_XNVME 00:23:25.599 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:25.599 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:25.599 19:00:10 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:25.599 19:00:10 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:25.599 19:00:10 reactor_set_interrupt -- 
paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:25.599 19:00:10 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:25.599 19:00:10 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:23:25.599 19:00:10 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:25.599 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:23:25.599 19:00:10 
reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:25.599 19:00:10 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:23:25.600 19:00:10 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:25.600 19:00:10 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:25.600 19:00:10 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:25.600 19:00:10 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@87 
-- # export SPDK_TEST_NVME_CLI 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:25.600 19:00:10 reactor_set_interrupt -- 
common/autotest_common.sh@124 -- # : 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- 
common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:23:25.600 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export 
PCI_BLOCK_SYNC_ON_RESET=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 
00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2209175 ]] 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2209175 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:23:25.601 19:00:10 reactor_set_interrupt 
-- common/autotest_common.sh@331 -- # local mount target_dir 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.EwY7t3 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.EwY7t3/tests/interrupt /tmp/spdk.EwY7t3 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:23:25.601 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=895512576 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4388917248 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=89610723328 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562764288 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5952040960 00:23:25.602 19:00:10 reactor_set_interrupt -- 
common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47725748224 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781380096 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=55631872 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=19102932992 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112554496 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9621504 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47780880384 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781384192 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=503808 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9556271104 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556275200 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:23:25.602 * Looking for test storage... 
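The df -T scan just traced feeds set_test_storage: each mount's type, size, and free space is recorded, and the candidate test directories are then checked against the requested ~2 GiB. A rough, self-contained sketch of that selection (variable names are illustrative, and the real helper also normalizes units and special-cases tmpfs/overlay mounts):

#!/usr/bin/env bash
requested_size=$((2 * 1024 * 1024 * 1024))   # ~2 GiB, as requested in the trace
candidates=("$PWD")                          # the trace uses $testdir plus a /tmp fallback

declare -A avails                            # free bytes per mount point
while read -r source fs size used avail _ mount; do
    avails["$mount"]=$avail
done < <(df -T --block-size=1 | grep -v Filesystem)

for dir in "${candidates[@]}"; do
    mount=$(df --block-size=1 "$dir" | awk '$1 !~ /Filesystem/{print $6}')
    if (( ${avails["$mount"]:-0} >= requested_size )); then
        echo "* Found test storage at $dir"
        break
    fi
done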
00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=89610723328 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8166633472 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.602 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set -o errtrace 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # true 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # xtrace_fd 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:25.602 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:25.602 19:00:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2209313 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:25.603 19:00:10 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2209313 /var/tmp/spdk.sock 00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2209313 ']' 00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:25.603 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
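start_intr_tgt, which the trace enters here, launches the interrupt_tgt example pinned to cores 0-2 and then blocks in waitforlisten until the JSON-RPC socket answers. A simplified stand-in for that sequence (flags copied from the trace; the polling loop below is only an approximation of waitforlisten):

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path from the trace
rpc_sock=/var/tmp/spdk.sock

# -m 0x07 pins the reactors to cores 0-2; the other flags are passed exactly as traced
"$SPDK_DIR/build/examples/interrupt_tgt" -m 0x07 -r "$rpc_sock" -E -g &
intr_tgt_pid=$!

# poll until the RPC server responds or the target dies
until "$SPDK_DIR/scripts/rpc.py" -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1; do
    kill -0 "$intr_tgt_pid" 2>/dev/null || { echo "interrupt_tgt exited early" >&2; exit 1; }
    sleep 0.2
done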
00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:25.603 19:00:10 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:25.603 [2024-07-24 19:00:10.583838] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:25.603 [2024-07-24 19:00:10.583881] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209313 ] 00:23:25.862 [2024-07-24 19:00:10.647214] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:25.862 [2024-07-24 19:00:10.727569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:25.862 [2024-07-24 19:00:10.727665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:25.862 [2024-07-24 19:00:10.727667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.862 [2024-07-24 19:00:10.790987] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:26.502 19:00:11 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:26.502 19:00:11 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:23:26.502 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:23:26.502 19:00:11 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:26.761 Malloc0 00:23:26.761 Malloc1 00:23:26.761 Malloc2 00:23:26.761 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:23:26.761 19:00:11 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:26.761 19:00:11 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:26.761 19:00:11 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:26.761 5000+0 records in 00:23:26.761 5000+0 records out 00:23:26.761 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0165288 s, 620 MB/s 00:23:26.761 19:00:11 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:27.020 AIO0 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2209313 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2209313 without_thd 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2209313 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:27.020 19:00:11 reactor_set_interrupt -- 
interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:27.020 19:00:11 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:27.020 19:00:12 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:27.280 spdk_thread ids are 1 on reactor0. 
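The thread-id lookup that just printed "spdk_thread ids are 1 on reactor0." is a jq filter over thread_get_stats: every thread whose cpumask matches the reactor's mask (with the 0x prefix stripped) is selected, so reactor 0 yields thread id 1 and reactor 2, which has no thread yet, yields nothing. Condensed, with the rpc.py path and jq expression taken verbatim from the trace:

rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

reactor_cpumask=1   # 0x1 -> reactor 0; thread_get_stats reports masks without the 0x prefix
$rpc_py thread_get_stats \
    | jq --arg reactor_cpumask "$reactor_cpumask" \
         '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'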
00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2209313 0 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2209313 0 idle 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:27.280 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209313 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.27 reactor_0' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209313 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.27 reactor_0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2209313 1 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2209313 1 idle 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209351 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.00 reactor_1' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209351 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.00 reactor_1 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2209313 2 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2209313 2 idle 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:27.539 19:00:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209352 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.00 reactor_2' 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209352 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.00 reactor_2 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
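Every reactor_is_idle / reactor_is_busy probe interleaved above boils down to one measurement: a single batch, per-thread top snapshot of the target pid, the reactor_N row picked out with grep, and its %CPU column read with awk; idle means no more than 30% and busy means at least 70%, retried for up to ten snapshots. A single-shot sketch of that probe (the retry loop and busy/idle dispatch are simplified):

pid=2209313; idx=0    # values from the trace; any reactor index works the same way
row=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
cpu=$(echo "$row" | sed -e 's/^\s*//g' | awk '{print $9}')   # %CPU column
cpu=${cpu%.*}                                                # drop the fractional part
if (( cpu <= 30 )); then
    echo "reactor_${idx} is idle"
elif (( cpu >= 70 )); then
    echo "reactor_${idx} is busy"
fi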
00:23:27.798 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:23:28.057 [2024-07-24 19:00:12.872418] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:28.057 19:00:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:28.057 [2024-07-24 19:00:13.040111] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:23:28.057 [2024-07-24 19:00:13.040502] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:28.057 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:28.315 [2024-07-24 19:00:13.212018] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:23:28.315 [2024-07-24 19:00:13.212127] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2209313 0 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2209313 0 busy 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:28.315 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209313 root 20 0 128.2g 36864 23808 R 99.9 0.0 0:00.63 reactor_0' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209313 root 20 0 128.2g 36864 23808 R 99.9 0.0 0:00.63 reactor_0 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:28.574 19:00:13 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2209313 2 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2209313 2 busy 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209352 root 20 0 128.2g 36864 23808 R 93.8 0.0 0:00.35 reactor_2' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209352 root 20 0 128.2g 36864 23808 R 93.8 0.0 0:00.35 reactor_2 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:23:28.574 19:00:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:28.575 19:00:13 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:28.575 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:28.833 [2024-07-24 19:00:13.740006] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:23:28.833 [2024-07-24 19:00:13.740092] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2209313 2 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2209313 2 idle 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:28.833 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209352 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.52 reactor_2' 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209352 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:00.52 reactor_2 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:29.091 19:00:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:29.091 [2024-07-24 19:00:14.080009] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:29.091 [2024-07-24 19:00:14.080109] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:29.091 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:23:29.091 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:23:29.091 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:23:29.351 [2024-07-24 19:00:14.252181] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
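The mode flips traced above are all the interrupt_plugin RPC: reactor_set_interrupt_mode N -d drops reactor N from interrupt mode to polling, and the same call without -d switches it back, the target acknowledging each change with the "complete reactor switch" notices. In shorthand, with the plugin and paths exactly as traced (the PYTHONPATH addition is what makes the plugin importable):

export PYTHONPATH=$PYTHONPATH:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

$rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d   # reactor 0 -> poll mode
$rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d   # reactor 2 -> poll mode
$rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2      # reactor 2 back to interrupt mode
$rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0      # reactor 0 back to interrupt mode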
00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2209313 0 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2209313 0 idle 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2209313 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2209313 -w 256 00:23:29.351 19:00:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2209313 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:01.31 reactor_0' 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2209313 root 20 0 128.2g 36864 23808 S 0.0 0.0 0:01.31 reactor_0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:23:29.610 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2209313 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2209313 ']' 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2209313 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2209313 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2209313' 00:23:29.610 killing process with pid 2209313 00:23:29.610 19:00:14 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 2209313 00:23:29.610 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2209313 00:23:29.868 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:23:29.868 19:00:14 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:29.868 19:00:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:23:29.868 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:29.869 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:29.869 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2210030 00:23:29.869 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:29.869 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:29.869 19:00:14 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2210030 /var/tmp/spdk.sock 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2210030 ']' 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:29.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:29.869 19:00:14 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:29.869 [2024-07-24 19:00:14.721039] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:29.869 [2024-07-24 19:00:14.721083] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210030 ] 00:23:29.869 [2024-07-24 19:00:14.785063] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:29.869 [2024-07-24 19:00:14.857639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:29.869 [2024-07-24 19:00:14.857746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:29.869 [2024-07-24 19:00:14.857747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:30.127 [2024-07-24 19:00:14.921253] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
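The second target started here (pid 2210030, the with-threads pass) goes through the same bdev preparation the first pass used and which the trace repeats below: a set of malloc bdevs plus one AIO bdev backed by a 10 MB file written with dd. A compact sketch of that step, assuming the same workspace layout:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
rpc_py=$SPDK_DIR/scripts/rpc.py
aiofile=$SPDK_DIR/test/interrupt/aiofile

dd if=/dev/zero of="$aiofile" bs=2048 count=5000   # 10 MB backing file, as dd'd in the trace
$rpc_py bdev_aio_create "$aiofile" AIO0 2048       # filename, bdev name, block size

# Malloc0/1/2 are also created through rpc.py; the exact batching is not visible in
# this fragment, but bdev_malloc_create (size and block size per bdev) is the call behind it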
00:23:30.695 19:00:15 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:30.695 19:00:15 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:23:30.695 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:23:30.695 19:00:15 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:30.954 Malloc0 00:23:30.954 Malloc1 00:23:30.954 Malloc2 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:30.954 5000+0 records in 00:23:30.954 5000+0 records out 00:23:30.954 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0107017 s, 957 MB/s 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:30.954 AIO0 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2210030 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2210030 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2210030 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:30.954 19:00:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:31.212 19:00:16 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:31.470 spdk_thread ids are 1 on reactor0. 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2210030 0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2210030 0 idle 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210030 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0' 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210030 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2210030 1 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2210030 1 idle 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:31.470 19:00:16 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:23:31.470 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210073 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210073 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2210030 2 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2210030 2 idle 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:31.729 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210074 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210074 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:31.988 [2024-07-24 19:00:16.970299] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:23:31.988 [2024-07-24 19:00:16.970487] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:23:31.988 [2024-07-24 19:00:16.970588] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:31.988 19:00:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:32.246 [2024-07-24 19:00:17.138581] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:23:32.246 [2024-07-24 19:00:17.138742] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2210030 0 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2210030 0 busy 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:32.246 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210030 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.61 reactor_0' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210030 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.61 reactor_0 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:32.505 19:00:17 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2210030 2 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2210030 2 busy 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210074 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.34 reactor_2' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210074 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.34 reactor_2 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:32.505 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:32.764 [2024-07-24 19:00:17.656023] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
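The mode switches themselves are all driven through one RPC, visible at reactor_set_interrupt.sh@43/@44/@51/@62 in the trace: scripts/rpc.py loads the test's interrupt_plugin and calls reactor_set_interrupt_mode with a reactor index, adding -d to disable interrupt mode (i.e. fall back to polling). Usage exactly as it appears in this log (no options beyond these are assumed):

# switch reactor 0 out of interrupt mode (back to poll mode)
./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d

# switch reactor 2 back into interrupt mode
./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2

# each call is acknowledged in the app log by interrupt_tgt.c:
#   "RPC Start to enable/disable interrupt mode on reactor N."
#   "complete reactor switch"
# and, for reactor 0, by thread.c switching app_thread between intr and poll mode.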
00:23:32.764 [2024-07-24 19:00:17.656106] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2210030 2 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2210030 2 idle 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:32.764 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210074 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2' 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210074 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:33.022 19:00:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:33.022 [2024-07-24 19:00:18.008930] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:33.022 [2024-07-24 19:00:18.009126] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
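For reference, the top_reactor strings captured by these checks line up with top's standard thread columns, which is why the helper reads field 9 with awk; the sample below is taken verbatim from the trace:

# 2210074  root  20  0  128.2g  36096  23040  S  0.0   0.0   0:00.51  reactor_2
#  PID     USER  PR  NI  VIRT   RES    SHR    S  %CPU  %MEM  TIME+    COMMAND
#  $1      $2    $3  $4  $5     $6     $7     $8 $9    $10   $11      $12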
00:23:33.022 [2024-07-24 19:00:18.009156] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2210030 0 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2210030 0 idle 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2210030 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2210030 -w 256 00:23:33.022 19:00:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2210030 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0' 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2210030 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:23:33.278 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2210030 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2210030 ']' 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2210030 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2210030 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2210030' 00:23:33.278 killing process with pid 2210030 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2210030 00:23:33.278 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2210030 00:23:33.536 19:00:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:23:33.536 19:00:18 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:33.536 00:23:33.536 real 0m8.123s 00:23:33.536 user 0m7.346s 00:23:33.536 sys 0m1.398s 00:23:33.536 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:33.536 19:00:18 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:33.536 ************************************ 00:23:33.536 END TEST reactor_set_interrupt 00:23:33.536 ************************************ 00:23:33.536 19:00:18 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:33.536 19:00:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:23:33.536 19:00:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:33.536 19:00:18 -- common/autotest_common.sh@10 -- # set +x 00:23:33.536 ************************************ 00:23:33.536 START TEST reap_unregistered_poller 00:23:33.536 ************************************ 00:23:33.536 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:33.796 * Looking for test storage... 00:23:33.796 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
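Just before the reap_unregistered_poller run starts above, the reactor_set_interrupt test tears its SPDK app down with the killprocess helper from autotest_common.sh: it refuses to run without a pid, confirms the process is still alive, looks up the command name so it does not kill a bare sudo wrapper, then kills and waits. A sketch reconstructed from the traced commands (behaviour on the error paths, e.g. when the name really is sudo, is an assumption here):

killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1                      # '[ -z 2210030 ]' guard in the trace
        kill -0 "$pid" 2>/dev/null || return 0         # nothing left to kill

        local process_name
        if [[ $(uname) == Linux ]]; then
                process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 in this run
        fi
        if [[ $process_name == sudo ]]; then
                return 1                               # assumed: never kill the sudo wrapper itself
        fi

        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" || true                            # trace shows kill followed by wait
}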
00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:33.796 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:33.796 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:33.796 19:00:18 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:33.797 19:00:18 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:33.797 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:33.797 #define SPDK_CONFIG_H 00:23:33.797 #define SPDK_CONFIG_APPS 1 00:23:33.797 #define SPDK_CONFIG_ARCH native 00:23:33.797 #undef SPDK_CONFIG_ASAN 00:23:33.797 #undef SPDK_CONFIG_AVAHI 00:23:33.797 #undef SPDK_CONFIG_CET 00:23:33.797 #define SPDK_CONFIG_COVERAGE 1 00:23:33.797 #define SPDK_CONFIG_CROSS_PREFIX 00:23:33.797 #define SPDK_CONFIG_CRYPTO 1 00:23:33.797 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:33.797 #undef SPDK_CONFIG_CUSTOMOCF 00:23:33.797 #undef SPDK_CONFIG_DAOS 00:23:33.797 #define SPDK_CONFIG_DAOS_DIR 00:23:33.797 #define SPDK_CONFIG_DEBUG 1 00:23:33.797 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:33.797 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:33.797 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:33.797 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:33.797 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:33.797 #undef SPDK_CONFIG_DPDK_UADK 00:23:33.797 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:33.797 #define SPDK_CONFIG_EXAMPLES 1 00:23:33.797 #undef SPDK_CONFIG_FC 00:23:33.797 #define SPDK_CONFIG_FC_PATH 00:23:33.797 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:33.797 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:33.797 #undef SPDK_CONFIG_FUSE 00:23:33.797 #undef SPDK_CONFIG_FUZZER 00:23:33.797 #define SPDK_CONFIG_FUZZER_LIB 00:23:33.797 #undef SPDK_CONFIG_GOLANG 00:23:33.797 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:33.797 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:23:33.797 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:33.797 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:33.797 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:33.797 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:33.797 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:33.797 #define SPDK_CONFIG_IDXD 1 00:23:33.797 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:33.797 #define SPDK_CONFIG_IPSEC_MB 1 00:23:33.797 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:33.797 #define SPDK_CONFIG_ISAL 1 00:23:33.797 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:33.797 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:33.797 #define SPDK_CONFIG_LIBDIR 00:23:33.797 #undef SPDK_CONFIG_LTO 
00:23:33.797 #define SPDK_CONFIG_MAX_LCORES 128 00:23:33.797 #define SPDK_CONFIG_NVME_CUSE 1 00:23:33.797 #undef SPDK_CONFIG_OCF 00:23:33.797 #define SPDK_CONFIG_OCF_PATH 00:23:33.797 #define SPDK_CONFIG_OPENSSL_PATH 00:23:33.797 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:33.797 #define SPDK_CONFIG_PGO_DIR 00:23:33.797 #undef SPDK_CONFIG_PGO_USE 00:23:33.797 #define SPDK_CONFIG_PREFIX /usr/local 00:23:33.797 #undef SPDK_CONFIG_RAID5F 00:23:33.797 #undef SPDK_CONFIG_RBD 00:23:33.797 #define SPDK_CONFIG_RDMA 1 00:23:33.797 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:33.797 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:33.797 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:33.797 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:33.797 #define SPDK_CONFIG_SHARED 1 00:23:33.797 #undef SPDK_CONFIG_SMA 00:23:33.797 #define SPDK_CONFIG_TESTS 1 00:23:33.797 #undef SPDK_CONFIG_TSAN 00:23:33.797 #define SPDK_CONFIG_UBLK 1 00:23:33.797 #define SPDK_CONFIG_UBSAN 1 00:23:33.797 #undef SPDK_CONFIG_UNIT_TESTS 00:23:33.797 #undef SPDK_CONFIG_URING 00:23:33.797 #define SPDK_CONFIG_URING_PATH 00:23:33.797 #undef SPDK_CONFIG_URING_ZNS 00:23:33.797 #undef SPDK_CONFIG_USDT 00:23:33.797 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:33.797 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:33.797 #undef SPDK_CONFIG_VFIO_USER 00:23:33.797 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:33.797 #define SPDK_CONFIG_VHOST 1 00:23:33.797 #define SPDK_CONFIG_VIRTIO 1 00:23:33.797 #undef SPDK_CONFIG_VTUNE 00:23:33.797 #define SPDK_CONFIG_VTUNE_DIR 00:23:33.797 #define SPDK_CONFIG_WERROR 1 00:23:33.797 #define SPDK_CONFIG_WPDK_DIR 00:23:33.797 #undef SPDK_CONFIG_XNVME 00:23:33.797 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:33.797 19:00:18 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:33.798 19:00:18 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:33.798 19:00:18 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:33.798 19:00:18 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:33.798 19:00:18 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:33.798 19:00:18 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:23:33.798 19:00:18 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:33.798 19:00:18 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:33.798 19:00:18 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:23:33.798 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:33.799 19:00:18 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:23:33.799 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2210740 ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2210740 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.RKils4 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RKils4/tests/interrupt /tmp/spdk.RKils4 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=895512576 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4388917248 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=89610563584 00:23:33.800 19:00:18 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562764288 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5952200704 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47725748224 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781380096 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=55631872 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=19102932992 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112554496 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9621504 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47780880384 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781384192 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=503808 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9556271104 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556275200 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:23:33.800 * Looking for test storage... 
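The trace above is set_test_storage picking a scratch directory for the interrupt tests: it snapshots `df -T` into associative arrays, then walks the candidate directories until one has room for the requested ~2 GiB without pushing its filesystem past 95% full. A condensed bash sketch of that logic, reconstructed from this trace and not the verbatim autotest_common.sh helper (tmpfs/ramfs special cases and the fallback mkdir are omitted), is:

    # Approximation of set_test_storage as seen in the trace; $testdir and
    # $storage_fallback are assumed to be set by the caller.
    requested_size=2214592512        # 2 GiB of scratch space plus overhead
    storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")

    declare -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))      # df -T reports 1K blocks; keep everything in bytes
        avails["$mount"]=$((avail * 1024))
        uses["$mount"]=$((use * 1024))
    done < <(df -T | grep -v Filesystem)

    for target_dir in "${storage_candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]}
        (( target_space == 0 || target_space < requested_size )) && continue
        # accept the candidate only if using it keeps the filesystem under 95% full
        new_size=$((sizes[$mount] - target_space + requested_size))
        (( new_size * 100 / sizes[$mount] > 95 )) && continue
        export SPDK_TEST_STORAGE=$target_dir
        printf '* Found test storage at %s\n' "$SPDK_TEST_STORAGE"
        break
    done

With the numbers in this run (target_space 89610563584, size 95562764288, requested 2214592512) the computed new_size of 8166793216 is well under the 95% ceiling, so the test directory itself is accepted, matching the "Found test storage" line that follows.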
00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=89610563584 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8166793216 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.800 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set -o errtrace 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # true 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # xtrace_fd 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:33.800 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:33.800 19:00:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2210786 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:33.801 19:00:18 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2210786 /var/tmp/spdk.sock 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2210786 ']' 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:23:33.801 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:33.801 19:00:18 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:33.801 [2024-07-24 19:00:18.782923] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:33.801 [2024-07-24 19:00:18.782966] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210786 ] 00:23:34.059 [2024-07-24 19:00:18.847323] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:34.059 [2024-07-24 19:00:18.923690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:34.059 [2024-07-24 19:00:18.923704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:34.059 [2024-07-24 19:00:18.923706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:34.059 [2024-07-24 19:00:18.986896] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:34.627 19:00:19 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:34.627 19:00:19 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:23:34.627 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:23:34.627 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:23:34.627 19:00:19 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:23:34.627 19:00:19 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:34.627 19:00:19 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:23:34.627 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:23:34.627 "name": "app_thread", 00:23:34.627 "id": 1, 00:23:34.627 "active_pollers": [], 00:23:34.627 "timed_pollers": [ 00:23:34.627 { 00:23:34.627 "name": "rpc_subsystem_poll_servers", 00:23:34.627 "id": 1, 00:23:34.627 "state": "waiting", 00:23:34.627 "run_count": 0, 00:23:34.627 "busy_count": 0, 00:23:34.627 "period_ticks": 8400000 00:23:34.627 } 00:23:34.627 ], 00:23:34.627 "paused_pollers": [] 00:23:34.627 }' 00:23:34.627 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/common.sh@76 
-- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:34.887 5000+0 records in 00:23:34.887 5000+0 records out 00:23:34.887 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0165037 s, 620 MB/s 00:23:34.887 19:00:19 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:35.145 AIO0 00:23:35.145 19:00:19 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:35.145 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:23:35.404 "name": "app_thread", 00:23:35.404 "id": 1, 00:23:35.404 "active_pollers": [], 00:23:35.404 "timed_pollers": [ 00:23:35.404 { 00:23:35.404 "name": "rpc_subsystem_poll_servers", 00:23:35.404 "id": 1, 00:23:35.404 "state": "waiting", 00:23:35.404 "run_count": 0, 00:23:35.404 "busy_count": 0, 00:23:35.404 "period_ticks": 8400000 00:23:35.404 } 00:23:35.404 ], 00:23:35.404 "paused_pollers": [] 00:23:35.404 }' 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:23:35.404 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2210786 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2210786 ']' 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2210786 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2210786 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:35.404 
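The dd/bdev_aio_create sequence above is the heart of the reap_unregistered_poller check: the test records the pollers on app_thread, attaches a temporary AIO bdev so extra pollers get registered and torn down, then re-queries and expects only the original poller set back. A condensed sketch using the same rpc.py and jq calls seen in the trace (paths shortened, trap and cleanup bookkeeping omitted; interrupt_tgt is assumed to already be serving RPCs on /var/tmp/spdk.sock):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    app_thread=$($rpc_py thread_get_pollers | jq -r '.threads[0]')
    native_pollers="$(jq -r '.active_pollers[].name' <<< "$app_thread") "
    native_pollers+=$(jq -r '.timed_pollers[].name' <<< "$app_thread")

    # attach a temporary AIO bdev so its pollers are registered and then torn down
    dd if=/dev/zero of=aiofile bs=2048 count=5000
    $rpc_py bdev_aio_create aiofile AIO0 2048
    $rpc_py bdev_wait_for_examine
    sleep 0.1

    app_thread=$($rpc_py thread_get_pollers | jq -r '.threads[0]')
    remaining_pollers="$(jq -r '.active_pollers[].name' <<< "$app_thread") "
    remaining_pollers+=$(jq -r '.timed_pollers[].name' <<< "$app_thread")

    # the unregistered AIO poller must have been reaped, leaving only the native set
    [[ "$remaining_pollers" == "$native_pollers" ]]

In this run both queries return only rpc_subsystem_poll_servers in timed_pollers, so the comparison passes and the test proceeds to kill the interrupt_tgt process.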
19:00:20 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2210786' 00:23:35.404 killing process with pid 2210786 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2210786 00:23:35.404 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2210786 00:23:35.663 19:00:20 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:23:35.663 19:00:20 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:35.663 00:23:35.663 real 0m2.047s 00:23:35.663 user 0m1.227s 00:23:35.663 sys 0m0.464s 00:23:35.663 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:35.663 19:00:20 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:35.663 ************************************ 00:23:35.663 END TEST reap_unregistered_poller 00:23:35.663 ************************************ 00:23:35.663 19:00:20 -- spdk/autotest.sh@198 -- # uname -s 00:23:35.663 19:00:20 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:23:35.663 19:00:20 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:23:35.663 19:00:20 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:23:35.663 19:00:20 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@260 -- # timing_exit lib 00:23:35.663 19:00:20 -- common/autotest_common.sh@728 -- # xtrace_disable 00:23:35.663 19:00:20 -- common/autotest_common.sh@10 -- # set +x 00:23:35.663 19:00:20 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:23:35.663 19:00:20 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:35.663 19:00:20 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:23:35.663 19:00:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:35.663 19:00:20 -- common/autotest_common.sh@10 -- # set +x 00:23:35.663 ************************************ 00:23:35.663 START TEST compress_compdev 00:23:35.663 ************************************ 00:23:35.663 19:00:20 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:35.922 * Looking for test storage... 
00:23:35.922 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:23:35.922 19:00:20 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:35.922 19:00:20 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:35.922 19:00:20 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:35.922 19:00:20 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:35.922 19:00:20 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:35.922 19:00:20 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:35.922 19:00:20 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:35.922 19:00:20 compress_compdev -- paths/export.sh@5 -- # export PATH 00:23:35.922 19:00:20 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:23:35.922 19:00:20 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:23:35.923 19:00:20 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:23:35.923 19:00:20 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:23:35.923 19:00:20 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:23:35.923 19:00:20 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:23:35.923 19:00:20 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2211321 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2211321 00:23:35.923 19:00:20 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2211321 ']' 00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:35.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
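run_bdevperf above starts bdevperf in a held state and waits for its RPC socket before any volumes are created. A sketch of that launch pattern, with the command line taken from this trace (the real compress.sh also installs error_cleanup traps, which are omitted here):

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    bdevperf=$rootdir/build/examples/bdevperf

    # -z holds bdevperf idle until an RPC tells it to run, -m 0x6 pins it to cores 1-2,
    # and dpdk.json configures the accel_dpdk_compressdev (QAT) module under test
    $bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c "$rootdir/test/compress/dpdk.json" &
    bdevperf_pid=$!
    waitforlisten "$bdevperf_pid"    # autotest helper: poll until /var/tmp/spdk.sock answers

Starting bdevperf before the volumes exist is deliberate: the compress vbdev is configured over RPC into the already-running process, and the verify workload is only triggered later with perform_tests.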
00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:35.923 19:00:20 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:35.923 [2024-07-24 19:00:20.809811] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:23:35.923 [2024-07-24 19:00:20.809851] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211321 ] 00:23:35.923 [2024-07-24 19:00:20.874182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:36.181 [2024-07-24 19:00:20.945414] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:36.181 [2024-07-24 19:00:20.945417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:36.440 [2024-07-24 19:00:21.316108] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:36.697 19:00:21 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:36.697 19:00:21 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:23:36.697 19:00:21 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:23:36.697 19:00:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:36.697 19:00:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:39.982 [2024-07-24 19:00:24.606720] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cd1bc0 PMD being used: compress_qat 00:23:39.982 19:00:24 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:39.982 19:00:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:39.982 [ 00:23:39.982 { 00:23:39.982 "name": "Nvme0n1", 00:23:39.982 "aliases": [ 00:23:39.982 "8ecb9929-93b2-481c-a6c9-55092705b60b" 00:23:39.982 ], 00:23:39.982 "product_name": "NVMe disk", 00:23:39.982 "block_size": 512, 00:23:39.982 "num_blocks": 1953525168, 00:23:39.982 "uuid": "8ecb9929-93b2-481c-a6c9-55092705b60b", 00:23:39.982 "assigned_rate_limits": { 00:23:39.982 "rw_ios_per_sec": 0, 00:23:39.982 "rw_mbytes_per_sec": 0, 00:23:39.982 "r_mbytes_per_sec": 0, 00:23:39.982 "w_mbytes_per_sec": 0 00:23:39.982 }, 00:23:39.982 "claimed": false, 00:23:39.982 "zoned": false, 00:23:39.982 "supported_io_types": { 00:23:39.982 "read": true, 00:23:39.982 "write": true, 00:23:39.982 "unmap": true, 00:23:39.982 "flush": true, 00:23:39.982 "reset": true, 00:23:39.982 "nvme_admin": true, 00:23:39.982 "nvme_io": true, 00:23:39.982 "nvme_io_md": false, 00:23:39.982 "write_zeroes": true, 00:23:39.982 "zcopy": false, 
00:23:39.982 "get_zone_info": false, 00:23:39.982 "zone_management": false, 00:23:39.982 "zone_append": false, 00:23:39.982 "compare": false, 00:23:39.982 "compare_and_write": false, 00:23:39.982 "abort": true, 00:23:39.982 "seek_hole": false, 00:23:39.982 "seek_data": false, 00:23:39.982 "copy": false, 00:23:39.982 "nvme_iov_md": false 00:23:39.982 }, 00:23:39.982 "driver_specific": { 00:23:39.982 "nvme": [ 00:23:39.982 { 00:23:39.982 "pci_address": "0000:5e:00.0", 00:23:39.982 "trid": { 00:23:39.982 "trtype": "PCIe", 00:23:39.982 "traddr": "0000:5e:00.0" 00:23:39.982 }, 00:23:39.983 "ctrlr_data": { 00:23:39.983 "cntlid": 0, 00:23:39.983 "vendor_id": "0x8086", 00:23:39.983 "model_number": "INTEL SSDPE2KX010T8", 00:23:39.983 "serial_number": "BTLJ807001JM1P0FGN", 00:23:39.983 "firmware_revision": "VDV10170", 00:23:39.983 "oacs": { 00:23:39.983 "security": 1, 00:23:39.983 "format": 1, 00:23:39.983 "firmware": 1, 00:23:39.983 "ns_manage": 1 00:23:39.983 }, 00:23:39.983 "multi_ctrlr": false, 00:23:39.983 "ana_reporting": false 00:23:39.983 }, 00:23:39.983 "vs": { 00:23:39.983 "nvme_version": "1.2" 00:23:39.983 }, 00:23:39.983 "ns_data": { 00:23:39.983 "id": 1, 00:23:39.983 "can_share": false 00:23:39.983 }, 00:23:39.983 "security": { 00:23:39.983 "opal": true 00:23:39.983 } 00:23:39.983 } 00:23:39.983 ], 00:23:39.983 "mp_policy": "active_passive" 00:23:39.983 } 00:23:39.983 } 00:23:39.983 ] 00:23:39.983 19:00:24 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:39.983 19:00:24 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:40.242 [2024-07-24 19:00:25.126388] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b08c20 PMD being used: compress_qat 00:23:41.177 0e2f023f-450f-47c9-ab9a-f718b8cf48e3 00:23:41.177 19:00:26 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:41.177 ab5b3585-6d23-4ff1-a142-860edf5a2881 00:23:41.435 19:00:26 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:41.435 19:00:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:41.693 [ 00:23:41.693 { 00:23:41.693 "name": "ab5b3585-6d23-4ff1-a142-860edf5a2881", 00:23:41.693 "aliases": [ 00:23:41.693 "lvs0/lv0" 00:23:41.693 ], 00:23:41.693 "product_name": "Logical Volume", 00:23:41.693 "block_size": 512, 00:23:41.693 "num_blocks": 204800, 00:23:41.693 "uuid": "ab5b3585-6d23-4ff1-a142-860edf5a2881", 00:23:41.693 "assigned_rate_limits": { 00:23:41.693 "rw_ios_per_sec": 0, 00:23:41.693 "rw_mbytes_per_sec": 0, 00:23:41.693 "r_mbytes_per_sec": 0, 00:23:41.693 "w_mbytes_per_sec": 0 00:23:41.693 }, 00:23:41.693 "claimed": false, 
00:23:41.693 "zoned": false, 00:23:41.693 "supported_io_types": { 00:23:41.693 "read": true, 00:23:41.693 "write": true, 00:23:41.693 "unmap": true, 00:23:41.693 "flush": false, 00:23:41.693 "reset": true, 00:23:41.693 "nvme_admin": false, 00:23:41.693 "nvme_io": false, 00:23:41.693 "nvme_io_md": false, 00:23:41.693 "write_zeroes": true, 00:23:41.693 "zcopy": false, 00:23:41.693 "get_zone_info": false, 00:23:41.693 "zone_management": false, 00:23:41.693 "zone_append": false, 00:23:41.693 "compare": false, 00:23:41.693 "compare_and_write": false, 00:23:41.693 "abort": false, 00:23:41.693 "seek_hole": true, 00:23:41.693 "seek_data": true, 00:23:41.693 "copy": false, 00:23:41.693 "nvme_iov_md": false 00:23:41.693 }, 00:23:41.693 "driver_specific": { 00:23:41.693 "lvol": { 00:23:41.693 "lvol_store_uuid": "0e2f023f-450f-47c9-ab9a-f718b8cf48e3", 00:23:41.693 "base_bdev": "Nvme0n1", 00:23:41.693 "thin_provision": true, 00:23:41.693 "num_allocated_clusters": 0, 00:23:41.693 "snapshot": false, 00:23:41.693 "clone": false, 00:23:41.693 "esnap_clone": false 00:23:41.693 } 00:23:41.693 } 00:23:41.693 } 00:23:41.693 ] 00:23:41.693 19:00:26 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:41.693 19:00:26 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:23:41.693 19:00:26 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:23:41.693 [2024-07-24 19:00:26.689847] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:41.693 COMP_lvs0/lv0 00:23:41.951 19:00:26 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:41.951 19:00:26 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:23:41.951 19:00:26 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:41.952 19:00:26 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:41.952 19:00:26 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:41.952 19:00:26 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:41.952 19:00:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:41.952 19:00:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:23:42.209 [ 00:23:42.209 { 00:23:42.209 "name": "COMP_lvs0/lv0", 00:23:42.209 "aliases": [ 00:23:42.209 "d35f1893-5ece-5e54-8dd7-2d817250f2fd" 00:23:42.209 ], 00:23:42.209 "product_name": "compress", 00:23:42.209 "block_size": 512, 00:23:42.209 "num_blocks": 200704, 00:23:42.209 "uuid": "d35f1893-5ece-5e54-8dd7-2d817250f2fd", 00:23:42.209 "assigned_rate_limits": { 00:23:42.209 "rw_ios_per_sec": 0, 00:23:42.209 "rw_mbytes_per_sec": 0, 00:23:42.209 "r_mbytes_per_sec": 0, 00:23:42.209 "w_mbytes_per_sec": 0 00:23:42.209 }, 00:23:42.209 "claimed": false, 00:23:42.209 "zoned": false, 00:23:42.209 "supported_io_types": { 00:23:42.209 "read": true, 00:23:42.210 "write": true, 00:23:42.210 "unmap": false, 00:23:42.210 "flush": false, 00:23:42.210 "reset": false, 00:23:42.210 "nvme_admin": false, 00:23:42.210 "nvme_io": false, 00:23:42.210 "nvme_io_md": false, 00:23:42.210 "write_zeroes": true, 00:23:42.210 "zcopy": false, 00:23:42.210 "get_zone_info": false, 00:23:42.210 
"zone_management": false, 00:23:42.210 "zone_append": false, 00:23:42.210 "compare": false, 00:23:42.210 "compare_and_write": false, 00:23:42.210 "abort": false, 00:23:42.210 "seek_hole": false, 00:23:42.210 "seek_data": false, 00:23:42.210 "copy": false, 00:23:42.210 "nvme_iov_md": false 00:23:42.210 }, 00:23:42.210 "driver_specific": { 00:23:42.210 "compress": { 00:23:42.210 "name": "COMP_lvs0/lv0", 00:23:42.210 "base_bdev_name": "ab5b3585-6d23-4ff1-a142-860edf5a2881", 00:23:42.210 "pm_path": "/tmp/pmem/568228ba-138e-482e-a9fc-ced115d1a925" 00:23:42.210 } 00:23:42.210 } 00:23:42.210 } 00:23:42.210 ] 00:23:42.210 19:00:27 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:42.210 19:00:27 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:42.210 [2024-07-24 19:00:27.119749] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fde481b15c0 PMD being used: compress_qat 00:23:42.210 [2024-07-24 19:00:27.121275] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b07660 PMD being used: compress_qat 00:23:42.210 Running I/O for 3 seconds... 00:23:45.495 00:23:45.495 Latency(us) 00:23:45.495 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:45.495 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:23:45.495 Verification LBA range: start 0x0 length 0x3100 00:23:45.495 COMP_lvs0/lv0 : 3.01 4048.88 15.82 0.00 0.00 7856.58 126.78 13918.60 00:23:45.495 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:23:45.495 Verification LBA range: start 0x3100 length 0x3100 00:23:45.495 COMP_lvs0/lv0 : 3.01 4149.07 16.21 0.00 0.00 7676.89 120.93 13856.18 00:23:45.495 =================================================================================================================== 00:23:45.495 Total : 8197.96 32.02 0.00 0.00 7765.69 120.93 13918.60 00:23:45.495 0 00:23:45.495 19:00:30 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:23:45.495 19:00:30 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:23:45.495 19:00:30 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:23:45.753 19:00:30 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:23:45.753 19:00:30 compress_compdev -- compress/compress.sh@78 -- # killprocess 2211321 00:23:45.753 19:00:30 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2211321 ']' 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2211321 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2211321 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2211321' 00:23:45.754 killing process with pid 2211321 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@967 -- # kill 2211321 00:23:45.754 
Received shutdown signal, test time was about 3.000000 seconds 00:23:45.754 00:23:45.754 Latency(us) 00:23:45.754 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:45.754 =================================================================================================================== 00:23:45.754 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:45.754 19:00:30 compress_compdev -- common/autotest_common.sh@972 -- # wait 2211321 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2213162 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:47.127 19:00:32 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2213162 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2213162 ']' 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:47.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:47.127 19:00:32 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:47.127 [2024-07-24 19:00:32.049673] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
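The second run repeats create_vols, this time passing a 512-byte logical block size through to bdev_compress_create. A condensed sketch of the create_vols sequence, assembled from the rpc.py calls visible in this log (waitforbdev retries and error handling omitted):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    lb_size=512    # empty on the first run, 512 here, 4096 on the final run

    # register the local NVMe controller (Nvme0n1) from the generated config
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh | $rpc_py load_subsystem_config

    # carve a 100 MiB thin-provisioned logical volume out of it
    $rpc_py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $rpc_py bdev_lvol_create -t -l lvs0 lv0 100

    # wrap the lvol in a compress vbdev whose persistent metadata lives under /tmp/pmem
    if [ -z "$lb_size" ]; then
        $rpc_py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
    else
        $rpc_py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l "$lb_size"
    fi

The resulting COMP_lvs0/lv0 bdev (200704 blocks of 512 bytes, pm_path under /tmp/pmem) is what bdevperf then runs the verify workload against.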
00:23:47.127 [2024-07-24 19:00:32.049715] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2213162 ] 00:23:47.127 [2024-07-24 19:00:32.112803] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:47.384 [2024-07-24 19:00:32.189700] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:47.384 [2024-07-24 19:00:32.189702] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:47.651 [2024-07-24 19:00:32.563152] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:47.909 19:00:32 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:47.909 19:00:32 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:23:47.909 19:00:32 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:23:47.909 19:00:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:47.909 19:00:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:51.262 [2024-07-24 19:00:35.838721] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e9dbc0 PMD being used: compress_qat 00:23:51.262 19:00:35 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:51.262 19:00:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:51.262 19:00:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:51.262 [ 00:23:51.262 { 00:23:51.262 "name": "Nvme0n1", 00:23:51.262 "aliases": [ 00:23:51.262 "23d325c4-d89b-47e5-a6df-aa19f4832ca5" 00:23:51.262 ], 00:23:51.262 "product_name": "NVMe disk", 00:23:51.262 "block_size": 512, 00:23:51.262 "num_blocks": 1953525168, 00:23:51.262 "uuid": "23d325c4-d89b-47e5-a6df-aa19f4832ca5", 00:23:51.262 "assigned_rate_limits": { 00:23:51.262 "rw_ios_per_sec": 0, 00:23:51.262 "rw_mbytes_per_sec": 0, 00:23:51.262 "r_mbytes_per_sec": 0, 00:23:51.262 "w_mbytes_per_sec": 0 00:23:51.262 }, 00:23:51.262 "claimed": false, 00:23:51.262 "zoned": false, 00:23:51.262 "supported_io_types": { 00:23:51.262 "read": true, 00:23:51.262 "write": true, 00:23:51.262 "unmap": true, 00:23:51.262 "flush": true, 00:23:51.262 "reset": true, 00:23:51.262 "nvme_admin": true, 00:23:51.262 "nvme_io": true, 00:23:51.262 "nvme_io_md": false, 00:23:51.262 "write_zeroes": true, 00:23:51.262 "zcopy": false, 00:23:51.262 "get_zone_info": false, 00:23:51.262 "zone_management": false, 00:23:51.262 "zone_append": false, 00:23:51.262 "compare": false, 00:23:51.262 "compare_and_write": false, 00:23:51.262 "abort": true, 00:23:51.262 "seek_hole": false, 00:23:51.262 "seek_data": false, 00:23:51.262 
"copy": false, 00:23:51.262 "nvme_iov_md": false 00:23:51.262 }, 00:23:51.262 "driver_specific": { 00:23:51.262 "nvme": [ 00:23:51.262 { 00:23:51.262 "pci_address": "0000:5e:00.0", 00:23:51.262 "trid": { 00:23:51.262 "trtype": "PCIe", 00:23:51.262 "traddr": "0000:5e:00.0" 00:23:51.262 }, 00:23:51.263 "ctrlr_data": { 00:23:51.263 "cntlid": 0, 00:23:51.263 "vendor_id": "0x8086", 00:23:51.263 "model_number": "INTEL SSDPE2KX010T8", 00:23:51.263 "serial_number": "BTLJ807001JM1P0FGN", 00:23:51.263 "firmware_revision": "VDV10170", 00:23:51.263 "oacs": { 00:23:51.263 "security": 1, 00:23:51.263 "format": 1, 00:23:51.263 "firmware": 1, 00:23:51.263 "ns_manage": 1 00:23:51.263 }, 00:23:51.263 "multi_ctrlr": false, 00:23:51.263 "ana_reporting": false 00:23:51.263 }, 00:23:51.263 "vs": { 00:23:51.263 "nvme_version": "1.2" 00:23:51.263 }, 00:23:51.263 "ns_data": { 00:23:51.263 "id": 1, 00:23:51.263 "can_share": false 00:23:51.263 }, 00:23:51.263 "security": { 00:23:51.263 "opal": true 00:23:51.263 } 00:23:51.263 } 00:23:51.263 ], 00:23:51.263 "mp_policy": "active_passive" 00:23:51.263 } 00:23:51.263 } 00:23:51.263 ] 00:23:51.263 19:00:36 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:51.263 19:00:36 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:51.521 [2024-07-24 19:00:36.358437] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cd4c80 PMD being used: compress_qat 00:23:52.455 c5aa7f52-c080-4c25-84ea-c80ad27d232d 00:23:52.455 19:00:37 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:52.455 c6d52bc7-4c52-4f34-94bf-0b512de51096 00:23:52.455 19:00:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:52.455 19:00:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:52.722 19:00:37 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:52.980 [ 00:23:52.980 { 00:23:52.980 "name": "c6d52bc7-4c52-4f34-94bf-0b512de51096", 00:23:52.980 "aliases": [ 00:23:52.980 "lvs0/lv0" 00:23:52.980 ], 00:23:52.980 "product_name": "Logical Volume", 00:23:52.980 "block_size": 512, 00:23:52.980 "num_blocks": 204800, 00:23:52.980 "uuid": "c6d52bc7-4c52-4f34-94bf-0b512de51096", 00:23:52.980 "assigned_rate_limits": { 00:23:52.980 "rw_ios_per_sec": 0, 00:23:52.980 "rw_mbytes_per_sec": 0, 00:23:52.980 "r_mbytes_per_sec": 0, 00:23:52.980 "w_mbytes_per_sec": 0 00:23:52.980 }, 00:23:52.980 "claimed": false, 00:23:52.981 "zoned": false, 00:23:52.981 "supported_io_types": { 00:23:52.981 "read": true, 00:23:52.981 "write": true, 00:23:52.981 "unmap": true, 00:23:52.981 "flush": false, 00:23:52.981 "reset": true, 00:23:52.981 "nvme_admin": false, 00:23:52.981 "nvme_io": false, 00:23:52.981 
"nvme_io_md": false, 00:23:52.981 "write_zeroes": true, 00:23:52.981 "zcopy": false, 00:23:52.981 "get_zone_info": false, 00:23:52.981 "zone_management": false, 00:23:52.981 "zone_append": false, 00:23:52.981 "compare": false, 00:23:52.981 "compare_and_write": false, 00:23:52.981 "abort": false, 00:23:52.981 "seek_hole": true, 00:23:52.981 "seek_data": true, 00:23:52.981 "copy": false, 00:23:52.981 "nvme_iov_md": false 00:23:52.981 }, 00:23:52.981 "driver_specific": { 00:23:52.981 "lvol": { 00:23:52.981 "lvol_store_uuid": "c5aa7f52-c080-4c25-84ea-c80ad27d232d", 00:23:52.981 "base_bdev": "Nvme0n1", 00:23:52.981 "thin_provision": true, 00:23:52.981 "num_allocated_clusters": 0, 00:23:52.981 "snapshot": false, 00:23:52.981 "clone": false, 00:23:52.981 "esnap_clone": false 00:23:52.981 } 00:23:52.981 } 00:23:52.981 } 00:23:52.981 ] 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:52.981 19:00:37 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:23:52.981 19:00:37 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:23:52.981 [2024-07-24 19:00:37.907119] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:52.981 COMP_lvs0/lv0 00:23:52.981 19:00:37 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:52.981 19:00:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:53.238 19:00:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:23:53.238 [ 00:23:53.238 { 00:23:53.238 "name": "COMP_lvs0/lv0", 00:23:53.238 "aliases": [ 00:23:53.238 "5aed86a0-2e18-59ab-a332-c64e9389da9b" 00:23:53.238 ], 00:23:53.238 "product_name": "compress", 00:23:53.238 "block_size": 512, 00:23:53.238 "num_blocks": 200704, 00:23:53.238 "uuid": "5aed86a0-2e18-59ab-a332-c64e9389da9b", 00:23:53.238 "assigned_rate_limits": { 00:23:53.238 "rw_ios_per_sec": 0, 00:23:53.238 "rw_mbytes_per_sec": 0, 00:23:53.238 "r_mbytes_per_sec": 0, 00:23:53.238 "w_mbytes_per_sec": 0 00:23:53.238 }, 00:23:53.238 "claimed": false, 00:23:53.238 "zoned": false, 00:23:53.238 "supported_io_types": { 00:23:53.238 "read": true, 00:23:53.238 "write": true, 00:23:53.238 "unmap": false, 00:23:53.238 "flush": false, 00:23:53.238 "reset": false, 00:23:53.238 "nvme_admin": false, 00:23:53.238 "nvme_io": false, 00:23:53.238 "nvme_io_md": false, 00:23:53.238 "write_zeroes": true, 00:23:53.238 "zcopy": false, 00:23:53.238 "get_zone_info": false, 00:23:53.238 "zone_management": false, 00:23:53.238 "zone_append": false, 00:23:53.238 "compare": false, 00:23:53.238 "compare_and_write": false, 00:23:53.238 "abort": false, 00:23:53.238 "seek_hole": false, 00:23:53.238 "seek_data": false, 00:23:53.238 "copy": false, 00:23:53.238 "nvme_iov_md": 
false 00:23:53.238 }, 00:23:53.238 "driver_specific": { 00:23:53.238 "compress": { 00:23:53.238 "name": "COMP_lvs0/lv0", 00:23:53.239 "base_bdev_name": "c6d52bc7-4c52-4f34-94bf-0b512de51096", 00:23:53.239 "pm_path": "/tmp/pmem/0058e23b-395a-46a7-b698-e6f3123d8b41" 00:23:53.239 } 00:23:53.239 } 00:23:53.239 } 00:23:53.239 ] 00:23:53.239 19:00:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:53.239 19:00:38 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:53.497 [2024-07-24 19:00:38.321025] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7facf81b15c0 PMD being used: compress_qat 00:23:53.497 [2024-07-24 19:00:38.322511] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e9a550 PMD being used: compress_qat 00:23:53.497 Running I/O for 3 seconds... 00:23:56.795 00:23:56.795 Latency(us) 00:23:56.795 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.795 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:23:56.795 Verification LBA range: start 0x0 length 0x3100 00:23:56.795 COMP_lvs0/lv0 : 3.01 4083.89 15.95 0.00 0.00 7797.46 126.78 13793.77 00:23:56.795 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:23:56.795 Verification LBA range: start 0x3100 length 0x3100 00:23:56.795 COMP_lvs0/lv0 : 3.01 4182.24 16.34 0.00 0.00 7616.29 120.44 13294.45 00:23:56.795 =================================================================================================================== 00:23:56.795 Total : 8266.13 32.29 0.00 0.00 7705.80 120.44 13793.77 00:23:56.795 0 00:23:56.795 19:00:41 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:23:56.795 19:00:41 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:23:56.795 19:00:41 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:23:56.795 19:00:41 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:23:56.795 19:00:41 compress_compdev -- compress/compress.sh@78 -- # killprocess 2213162 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2213162 ']' 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2213162 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2213162 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2213162' 00:23:56.795 killing process with pid 2213162 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@967 -- # kill 2213162 00:23:56.795 Received shutdown signal, test time was about 3.000000 seconds 00:23:56.795 00:23:56.795 Latency(us) 00:23:56.795 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.795 
=================================================================================================================== 00:23:56.795 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:56.795 19:00:41 compress_compdev -- common/autotest_common.sh@972 -- # wait 2213162 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2215001 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:58.696 19:00:43 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2215001 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2215001 ']' 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:58.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:58.696 19:00:43 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:58.696 [2024-07-24 19:00:43.251742] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:23:58.696 [2024-07-24 19:00:43.251785] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2215001 ] 00:23:58.696 [2024-07-24 19:00:43.315291] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:58.696 [2024-07-24 19:00:43.384406] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:58.696 [2024-07-24 19:00:43.384408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:58.954 [2024-07-24 19:00:43.755105] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:59.211 19:00:44 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:59.211 19:00:44 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:23:59.211 19:00:44 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:23:59.211 19:00:44 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:59.211 19:00:44 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:02.505 [2024-07-24 19:00:47.038671] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24d2bc0 PMD being used: compress_qat 00:24:02.505 19:00:47 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:02.505 [ 00:24:02.505 { 00:24:02.505 "name": "Nvme0n1", 00:24:02.505 "aliases": [ 00:24:02.505 "af9cabf6-f9c6-4cc0-8343-e8e1c944fec9" 00:24:02.505 ], 00:24:02.505 "product_name": "NVMe disk", 00:24:02.505 "block_size": 512, 00:24:02.505 "num_blocks": 1953525168, 00:24:02.505 "uuid": "af9cabf6-f9c6-4cc0-8343-e8e1c944fec9", 00:24:02.505 "assigned_rate_limits": { 00:24:02.505 "rw_ios_per_sec": 0, 00:24:02.505 "rw_mbytes_per_sec": 0, 00:24:02.505 "r_mbytes_per_sec": 0, 00:24:02.505 "w_mbytes_per_sec": 0 00:24:02.505 }, 00:24:02.505 "claimed": false, 00:24:02.505 "zoned": false, 00:24:02.505 "supported_io_types": { 00:24:02.505 "read": true, 00:24:02.505 "write": true, 00:24:02.505 "unmap": true, 00:24:02.505 "flush": true, 00:24:02.505 "reset": true, 00:24:02.505 "nvme_admin": true, 00:24:02.505 "nvme_io": true, 00:24:02.505 "nvme_io_md": false, 00:24:02.505 "write_zeroes": true, 00:24:02.505 "zcopy": false, 00:24:02.505 "get_zone_info": false, 00:24:02.505 "zone_management": false, 00:24:02.505 "zone_append": false, 00:24:02.505 "compare": false, 00:24:02.505 "compare_and_write": false, 00:24:02.505 "abort": true, 00:24:02.505 "seek_hole": false, 00:24:02.505 "seek_data": false, 00:24:02.505 
"copy": false, 00:24:02.505 "nvme_iov_md": false 00:24:02.505 }, 00:24:02.505 "driver_specific": { 00:24:02.505 "nvme": [ 00:24:02.505 { 00:24:02.505 "pci_address": "0000:5e:00.0", 00:24:02.505 "trid": { 00:24:02.505 "trtype": "PCIe", 00:24:02.505 "traddr": "0000:5e:00.0" 00:24:02.505 }, 00:24:02.505 "ctrlr_data": { 00:24:02.505 "cntlid": 0, 00:24:02.505 "vendor_id": "0x8086", 00:24:02.505 "model_number": "INTEL SSDPE2KX010T8", 00:24:02.505 "serial_number": "BTLJ807001JM1P0FGN", 00:24:02.505 "firmware_revision": "VDV10170", 00:24:02.505 "oacs": { 00:24:02.505 "security": 1, 00:24:02.505 "format": 1, 00:24:02.505 "firmware": 1, 00:24:02.505 "ns_manage": 1 00:24:02.505 }, 00:24:02.505 "multi_ctrlr": false, 00:24:02.505 "ana_reporting": false 00:24:02.505 }, 00:24:02.505 "vs": { 00:24:02.505 "nvme_version": "1.2" 00:24:02.505 }, 00:24:02.505 "ns_data": { 00:24:02.505 "id": 1, 00:24:02.505 "can_share": false 00:24:02.505 }, 00:24:02.505 "security": { 00:24:02.505 "opal": true 00:24:02.505 } 00:24:02.505 } 00:24:02.505 ], 00:24:02.505 "mp_policy": "active_passive" 00:24:02.505 } 00:24:02.505 } 00:24:02.505 ] 00:24:02.505 19:00:47 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:02.505 19:00:47 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:02.763 [2024-07-24 19:00:47.570544] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2309c20 PMD being used: compress_qat 00:24:03.698 1aa300c5-b026-4e7d-b94b-01bc7ef89237 00:24:03.698 19:00:48 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:03.698 ea1a7ab9-fb7a-4455-8c31-1f1cd85daeb1 00:24:03.698 19:00:48 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:03.698 19:00:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:03.955 19:00:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:03.955 [ 00:24:03.955 { 00:24:03.955 "name": "ea1a7ab9-fb7a-4455-8c31-1f1cd85daeb1", 00:24:03.955 "aliases": [ 00:24:03.955 "lvs0/lv0" 00:24:03.955 ], 00:24:03.955 "product_name": "Logical Volume", 00:24:03.955 "block_size": 512, 00:24:03.955 "num_blocks": 204800, 00:24:03.955 "uuid": "ea1a7ab9-fb7a-4455-8c31-1f1cd85daeb1", 00:24:03.955 "assigned_rate_limits": { 00:24:03.955 "rw_ios_per_sec": 0, 00:24:03.955 "rw_mbytes_per_sec": 0, 00:24:03.955 "r_mbytes_per_sec": 0, 00:24:03.955 "w_mbytes_per_sec": 0 00:24:03.955 }, 00:24:03.955 "claimed": false, 00:24:03.955 "zoned": false, 00:24:03.955 "supported_io_types": { 00:24:03.955 "read": true, 00:24:03.955 "write": true, 00:24:03.955 "unmap": true, 00:24:03.955 "flush": false, 00:24:03.955 "reset": true, 00:24:03.955 "nvme_admin": false, 00:24:03.955 "nvme_io": false, 00:24:03.955 
"nvme_io_md": false, 00:24:03.955 "write_zeroes": true, 00:24:03.955 "zcopy": false, 00:24:03.955 "get_zone_info": false, 00:24:03.955 "zone_management": false, 00:24:03.955 "zone_append": false, 00:24:03.955 "compare": false, 00:24:03.955 "compare_and_write": false, 00:24:03.955 "abort": false, 00:24:03.955 "seek_hole": true, 00:24:03.955 "seek_data": true, 00:24:03.955 "copy": false, 00:24:03.955 "nvme_iov_md": false 00:24:03.955 }, 00:24:03.955 "driver_specific": { 00:24:03.955 "lvol": { 00:24:03.955 "lvol_store_uuid": "1aa300c5-b026-4e7d-b94b-01bc7ef89237", 00:24:03.955 "base_bdev": "Nvme0n1", 00:24:03.955 "thin_provision": true, 00:24:03.955 "num_allocated_clusters": 0, 00:24:03.955 "snapshot": false, 00:24:03.955 "clone": false, 00:24:03.955 "esnap_clone": false 00:24:03.955 } 00:24:03.955 } 00:24:03.955 } 00:24:03.955 ] 00:24:04.214 19:00:48 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:04.214 19:00:48 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:24:04.214 19:00:48 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:24:04.214 [2024-07-24 19:00:49.131663] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:04.214 COMP_lvs0/lv0 00:24:04.214 19:00:49 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:04.214 19:00:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:04.472 19:00:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:04.472 [ 00:24:04.472 { 00:24:04.472 "name": "COMP_lvs0/lv0", 00:24:04.472 "aliases": [ 00:24:04.472 "23bc92d2-b21e-5bde-9654-f583931f9372" 00:24:04.472 ], 00:24:04.472 "product_name": "compress", 00:24:04.472 "block_size": 4096, 00:24:04.472 "num_blocks": 25088, 00:24:04.472 "uuid": "23bc92d2-b21e-5bde-9654-f583931f9372", 00:24:04.472 "assigned_rate_limits": { 00:24:04.472 "rw_ios_per_sec": 0, 00:24:04.472 "rw_mbytes_per_sec": 0, 00:24:04.472 "r_mbytes_per_sec": 0, 00:24:04.472 "w_mbytes_per_sec": 0 00:24:04.472 }, 00:24:04.472 "claimed": false, 00:24:04.472 "zoned": false, 00:24:04.472 "supported_io_types": { 00:24:04.472 "read": true, 00:24:04.472 "write": true, 00:24:04.472 "unmap": false, 00:24:04.472 "flush": false, 00:24:04.472 "reset": false, 00:24:04.472 "nvme_admin": false, 00:24:04.472 "nvme_io": false, 00:24:04.472 "nvme_io_md": false, 00:24:04.472 "write_zeroes": true, 00:24:04.472 "zcopy": false, 00:24:04.472 "get_zone_info": false, 00:24:04.472 "zone_management": false, 00:24:04.472 "zone_append": false, 00:24:04.472 "compare": false, 00:24:04.472 "compare_and_write": false, 00:24:04.472 "abort": false, 00:24:04.472 "seek_hole": false, 00:24:04.472 "seek_data": false, 00:24:04.472 "copy": false, 00:24:04.472 
"nvme_iov_md": false 00:24:04.472 }, 00:24:04.472 "driver_specific": { 00:24:04.472 "compress": { 00:24:04.472 "name": "COMP_lvs0/lv0", 00:24:04.472 "base_bdev_name": "ea1a7ab9-fb7a-4455-8c31-1f1cd85daeb1", 00:24:04.472 "pm_path": "/tmp/pmem/c90e7f43-69be-4f7c-af29-02b1f8de651a" 00:24:04.472 } 00:24:04.472 } 00:24:04.472 } 00:24:04.472 ] 00:24:04.472 19:00:49 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:04.473 19:00:49 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:04.729 [2024-07-24 19:00:49.553408] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fea4c1b15c0 PMD being used: compress_qat 00:24:04.729 [2024-07-24 19:00:49.554907] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24cf600 PMD being used: compress_qat 00:24:04.729 Running I/O for 3 seconds... 00:24:08.010 00:24:08.010 Latency(us) 00:24:08.010 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:08.010 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:08.010 Verification LBA range: start 0x0 length 0x3100 00:24:08.010 COMP_lvs0/lv0 : 3.01 3951.43 15.44 0.00 0.00 8060.53 172.62 13232.03 00:24:08.010 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:08.010 Verification LBA range: start 0x3100 length 0x3100 00:24:08.010 COMP_lvs0/lv0 : 3.00 4047.14 15.81 0.00 0.00 7873.21 165.79 13232.03 00:24:08.010 =================================================================================================================== 00:24:08.010 Total : 7998.56 31.24 0.00 0.00 7965.78 165.79 13232.03 00:24:08.010 0 00:24:08.010 19:00:52 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:24:08.010 19:00:52 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:08.010 19:00:52 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:08.010 19:00:52 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:08.010 19:00:52 compress_compdev -- compress/compress.sh@78 -- # killprocess 2215001 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2215001 ']' 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2215001 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2215001 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2215001' 00:24:08.010 killing process with pid 2215001 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@967 -- # kill 2215001 00:24:08.010 Received shutdown signal, test time was about 3.000000 seconds 00:24:08.010 00:24:08.010 Latency(us) 00:24:08.010 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:08.010 
=================================================================================================================== 00:24:08.010 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:08.010 19:00:52 compress_compdev -- common/autotest_common.sh@972 -- # wait 2215001 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2216838 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:24:09.912 19:00:54 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2216838 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2216838 ']' 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:09.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:09.912 19:00:54 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:09.912 [2024-07-24 19:00:54.513771] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:24:09.912 [2024-07-24 19:00:54.513814] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2216838 ] 00:24:09.912 [2024-07-24 19:00:54.578648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:09.912 [2024-07-24 19:00:54.652562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:09.912 [2024-07-24 19:00:54.652657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.912 [2024-07-24 19:00:54.652657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:10.172 [2024-07-24 19:00:55.025583] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:10.430 19:00:55 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:10.430 19:00:55 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:10.430 19:00:55 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:24:10.430 19:00:55 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:10.430 19:00:55 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:13.714 [2024-07-24 19:00:58.310801] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x267c7e0 PMD being used: compress_qat 00:24:13.714 19:00:58 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:13.714 [ 00:24:13.714 { 00:24:13.714 "name": "Nvme0n1", 00:24:13.714 "aliases": [ 00:24:13.714 "091d064b-c037-43f2-8921-2fd98cd22276" 00:24:13.714 ], 00:24:13.714 "product_name": "NVMe disk", 00:24:13.714 "block_size": 512, 00:24:13.714 "num_blocks": 1953525168, 00:24:13.714 "uuid": "091d064b-c037-43f2-8921-2fd98cd22276", 00:24:13.714 "assigned_rate_limits": { 00:24:13.714 "rw_ios_per_sec": 0, 00:24:13.714 "rw_mbytes_per_sec": 0, 00:24:13.714 "r_mbytes_per_sec": 0, 00:24:13.714 "w_mbytes_per_sec": 0 00:24:13.714 }, 00:24:13.714 "claimed": false, 00:24:13.714 "zoned": false, 00:24:13.714 "supported_io_types": { 00:24:13.714 "read": true, 00:24:13.714 "write": true, 00:24:13.714 "unmap": true, 00:24:13.714 "flush": true, 00:24:13.714 "reset": true, 00:24:13.714 "nvme_admin": true, 00:24:13.714 "nvme_io": true, 00:24:13.714 "nvme_io_md": false, 00:24:13.714 "write_zeroes": true, 00:24:13.714 "zcopy": false, 00:24:13.714 "get_zone_info": false, 00:24:13.714 "zone_management": false, 00:24:13.714 "zone_append": false, 00:24:13.714 "compare": false, 00:24:13.714 "compare_and_write": false, 
00:24:13.714 "abort": true, 00:24:13.714 "seek_hole": false, 00:24:13.714 "seek_data": false, 00:24:13.714 "copy": false, 00:24:13.714 "nvme_iov_md": false 00:24:13.714 }, 00:24:13.714 "driver_specific": { 00:24:13.714 "nvme": [ 00:24:13.714 { 00:24:13.714 "pci_address": "0000:5e:00.0", 00:24:13.714 "trid": { 00:24:13.714 "trtype": "PCIe", 00:24:13.714 "traddr": "0000:5e:00.0" 00:24:13.714 }, 00:24:13.714 "ctrlr_data": { 00:24:13.714 "cntlid": 0, 00:24:13.714 "vendor_id": "0x8086", 00:24:13.714 "model_number": "INTEL SSDPE2KX010T8", 00:24:13.714 "serial_number": "BTLJ807001JM1P0FGN", 00:24:13.714 "firmware_revision": "VDV10170", 00:24:13.714 "oacs": { 00:24:13.714 "security": 1, 00:24:13.714 "format": 1, 00:24:13.714 "firmware": 1, 00:24:13.714 "ns_manage": 1 00:24:13.714 }, 00:24:13.714 "multi_ctrlr": false, 00:24:13.714 "ana_reporting": false 00:24:13.714 }, 00:24:13.714 "vs": { 00:24:13.714 "nvme_version": "1.2" 00:24:13.714 }, 00:24:13.714 "ns_data": { 00:24:13.714 "id": 1, 00:24:13.714 "can_share": false 00:24:13.714 }, 00:24:13.714 "security": { 00:24:13.714 "opal": true 00:24:13.714 } 00:24:13.714 } 00:24:13.714 ], 00:24:13.714 "mp_policy": "active_passive" 00:24:13.714 } 00:24:13.714 } 00:24:13.714 ] 00:24:13.714 19:00:58 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:13.714 19:00:58 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:13.973 [2024-07-24 19:00:58.842560] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x267ebb0 PMD being used: compress_qat 00:24:14.915 d8cfded2-e629-4350-b4de-b65082e216cb 00:24:14.915 19:00:59 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:14.915 29a1ba5a-6499-424b-a939-648593233080 00:24:14.915 19:00:59 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:14.915 19:00:59 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:15.173 19:01:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:15.432 [ 00:24:15.432 { 00:24:15.432 "name": "29a1ba5a-6499-424b-a939-648593233080", 00:24:15.432 "aliases": [ 00:24:15.432 "lvs0/lv0" 00:24:15.432 ], 00:24:15.432 "product_name": "Logical Volume", 00:24:15.432 "block_size": 512, 00:24:15.432 "num_blocks": 204800, 00:24:15.432 "uuid": "29a1ba5a-6499-424b-a939-648593233080", 00:24:15.432 "assigned_rate_limits": { 00:24:15.432 "rw_ios_per_sec": 0, 00:24:15.432 "rw_mbytes_per_sec": 0, 00:24:15.432 "r_mbytes_per_sec": 0, 00:24:15.432 "w_mbytes_per_sec": 0 00:24:15.432 }, 00:24:15.433 "claimed": false, 00:24:15.433 "zoned": false, 00:24:15.433 "supported_io_types": { 00:24:15.433 "read": true, 00:24:15.433 "write": true, 00:24:15.433 "unmap": true, 00:24:15.433 "flush": false, 
00:24:15.433 "reset": true, 00:24:15.433 "nvme_admin": false, 00:24:15.433 "nvme_io": false, 00:24:15.433 "nvme_io_md": false, 00:24:15.433 "write_zeroes": true, 00:24:15.433 "zcopy": false, 00:24:15.433 "get_zone_info": false, 00:24:15.433 "zone_management": false, 00:24:15.433 "zone_append": false, 00:24:15.433 "compare": false, 00:24:15.433 "compare_and_write": false, 00:24:15.433 "abort": false, 00:24:15.433 "seek_hole": true, 00:24:15.433 "seek_data": true, 00:24:15.433 "copy": false, 00:24:15.433 "nvme_iov_md": false 00:24:15.433 }, 00:24:15.433 "driver_specific": { 00:24:15.433 "lvol": { 00:24:15.433 "lvol_store_uuid": "d8cfded2-e629-4350-b4de-b65082e216cb", 00:24:15.433 "base_bdev": "Nvme0n1", 00:24:15.433 "thin_provision": true, 00:24:15.433 "num_allocated_clusters": 0, 00:24:15.433 "snapshot": false, 00:24:15.433 "clone": false, 00:24:15.433 "esnap_clone": false 00:24:15.433 } 00:24:15.433 } 00:24:15.433 } 00:24:15.433 ] 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:15.433 19:01:00 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:15.433 19:01:00 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:15.433 [2024-07-24 19:01:00.393761] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:15.433 COMP_lvs0/lv0 00:24:15.433 19:01:00 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:15.433 19:01:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:15.706 19:01:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:16.012 [ 00:24:16.012 { 00:24:16.012 "name": "COMP_lvs0/lv0", 00:24:16.012 "aliases": [ 00:24:16.012 "8f792cae-3b8a-5726-b410-55dd6f50a581" 00:24:16.012 ], 00:24:16.012 "product_name": "compress", 00:24:16.012 "block_size": 512, 00:24:16.012 "num_blocks": 200704, 00:24:16.012 "uuid": "8f792cae-3b8a-5726-b410-55dd6f50a581", 00:24:16.012 "assigned_rate_limits": { 00:24:16.012 "rw_ios_per_sec": 0, 00:24:16.012 "rw_mbytes_per_sec": 0, 00:24:16.012 "r_mbytes_per_sec": 0, 00:24:16.012 "w_mbytes_per_sec": 0 00:24:16.012 }, 00:24:16.012 "claimed": false, 00:24:16.012 "zoned": false, 00:24:16.012 "supported_io_types": { 00:24:16.012 "read": true, 00:24:16.012 "write": true, 00:24:16.012 "unmap": false, 00:24:16.012 "flush": false, 00:24:16.012 "reset": false, 00:24:16.012 "nvme_admin": false, 00:24:16.012 "nvme_io": false, 00:24:16.012 "nvme_io_md": false, 00:24:16.012 "write_zeroes": true, 00:24:16.012 "zcopy": false, 00:24:16.012 "get_zone_info": false, 00:24:16.012 "zone_management": false, 00:24:16.012 "zone_append": false, 00:24:16.012 "compare": false, 00:24:16.012 "compare_and_write": false, 00:24:16.012 "abort": false, 00:24:16.012 
"seek_hole": false, 00:24:16.012 "seek_data": false, 00:24:16.012 "copy": false, 00:24:16.012 "nvme_iov_md": false 00:24:16.012 }, 00:24:16.012 "driver_specific": { 00:24:16.012 "compress": { 00:24:16.012 "name": "COMP_lvs0/lv0", 00:24:16.012 "base_bdev_name": "29a1ba5a-6499-424b-a939-648593233080", 00:24:16.012 "pm_path": "/tmp/pmem/f46b9c69-b55a-4a69-a79f-8aaa86b8ece0" 00:24:16.012 } 00:24:16.012 } 00:24:16.012 } 00:24:16.012 ] 00:24:16.012 19:01:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:16.012 19:01:00 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:16.012 [2024-07-24 19:01:00.814576] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1e181b1350 PMD being used: compress_qat 00:24:16.012 I/O targets: 00:24:16.012 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:16.012 00:24:16.012 00:24:16.012 CUnit - A unit testing framework for C - Version 2.1-3 00:24:16.012 http://cunit.sourceforge.net/ 00:24:16.012 00:24:16.012 00:24:16.012 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:16.012 Test: blockdev write read block ...passed 00:24:16.012 Test: blockdev write zeroes read block ...passed 00:24:16.012 Test: blockdev write zeroes read no split ...passed 00:24:16.012 Test: blockdev write zeroes read split ...passed 00:24:16.012 Test: blockdev write zeroes read split partial ...passed 00:24:16.012 Test: blockdev reset ...[2024-07-24 19:01:00.872291] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:16.012 passed 00:24:16.012 Test: blockdev write read 8 blocks ...passed 00:24:16.012 Test: blockdev write read size > 128k ...passed 00:24:16.012 Test: blockdev write read invalid size ...passed 00:24:16.012 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:16.012 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:16.012 Test: blockdev write read max offset ...passed 00:24:16.012 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:16.012 Test: blockdev writev readv 8 blocks ...passed 00:24:16.012 Test: blockdev writev readv 30 x 1block ...passed 00:24:16.012 Test: blockdev writev readv block ...passed 00:24:16.012 Test: blockdev writev readv size > 128k ...passed 00:24:16.012 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:16.012 Test: blockdev comparev and writev ...passed 00:24:16.012 Test: blockdev nvme passthru rw ...passed 00:24:16.012 Test: blockdev nvme passthru vendor specific ...passed 00:24:16.012 Test: blockdev nvme admin passthru ...passed 00:24:16.012 Test: blockdev copy ...passed 00:24:16.012 00:24:16.012 Run Summary: Type Total Ran Passed Failed Inactive 00:24:16.012 suites 1 1 n/a 0 0 00:24:16.012 tests 23 23 23 0 0 00:24:16.012 asserts 130 130 130 0 n/a 00:24:16.012 00:24:16.012 Elapsed time = 0.172 seconds 00:24:16.012 0 00:24:16.012 19:01:00 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:24:16.012 19:01:00 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:16.275 19:01:01 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:16.275 19:01:01 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:16.275 19:01:01 compress_compdev -- compress/compress.sh@62 -- # 
killprocess 2216838 00:24:16.275 19:01:01 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2216838 ']' 00:24:16.275 19:01:01 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2216838 00:24:16.275 19:01:01 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:16.275 19:01:01 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:16.275 19:01:01 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2216838 00:24:16.533 19:01:01 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:16.533 19:01:01 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:16.533 19:01:01 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2216838' 00:24:16.533 killing process with pid 2216838 00:24:16.533 19:01:01 compress_compdev -- common/autotest_common.sh@967 -- # kill 2216838 00:24:16.533 19:01:01 compress_compdev -- common/autotest_common.sh@972 -- # wait 2216838 00:24:17.910 19:01:02 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:24:17.910 19:01:02 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:24:17.910 00:24:17.910 real 0m42.118s 00:24:17.910 user 1m34.529s 00:24:17.910 sys 0m3.437s 00:24:17.910 19:01:02 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:17.910 19:01:02 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:17.910 ************************************ 00:24:17.910 END TEST compress_compdev 00:24:17.910 ************************************ 00:24:17.910 19:01:02 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:17.910 19:01:02 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:24:17.910 19:01:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:17.910 19:01:02 -- common/autotest_common.sh@10 -- # set +x 00:24:17.910 ************************************ 00:24:17.910 START TEST compress_isal 00:24:17.910 ************************************ 00:24:17.910 19:01:02 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:18.169 * Looking for test storage... 
00:24:18.169 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:18.169 19:01:02 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:18.169 19:01:02 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:18.169 19:01:02 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:18.169 19:01:02 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:18.169 19:01:02 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:18.170 19:01:02 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:18.170 19:01:02 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:18.170 19:01:02 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:18.170 19:01:02 compress_isal -- paths/export.sh@5 -- # export PATH 00:24:18.170 19:01:02 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@47 -- # : 0 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:18.170 19:01:02 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2218279 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2218279 00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2218279 ']' 00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:18.170 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:18.170 19:01:02 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:18.170 19:01:02 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:18.170 [2024-07-24 19:01:03.001248] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:24:18.170 [2024-07-24 19:01:03.001288] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2218279 ] 00:24:18.170 [2024-07-24 19:01:03.067178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:18.170 [2024-07-24 19:01:03.146681] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:18.170 [2024-07-24 19:01:03.146694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:19.117 19:01:03 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:19.117 19:01:03 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:19.117 19:01:03 compress_isal -- compress/compress.sh@74 -- # create_vols 00:24:19.117 19:01:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:19.117 19:01:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:22.404 19:01:06 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:22.404 19:01:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:22.404 [ 00:24:22.404 { 00:24:22.404 "name": "Nvme0n1", 00:24:22.404 "aliases": [ 00:24:22.404 "91abd725-cc9f-490b-bd91-f03380c92c56" 00:24:22.404 ], 00:24:22.404 "product_name": "NVMe disk", 00:24:22.404 "block_size": 512, 00:24:22.404 "num_blocks": 1953525168, 00:24:22.404 "uuid": "91abd725-cc9f-490b-bd91-f03380c92c56", 00:24:22.404 "assigned_rate_limits": { 00:24:22.404 "rw_ios_per_sec": 0, 00:24:22.404 "rw_mbytes_per_sec": 0, 00:24:22.404 "r_mbytes_per_sec": 0, 00:24:22.404 "w_mbytes_per_sec": 0 00:24:22.404 }, 00:24:22.404 "claimed": false, 00:24:22.404 "zoned": false, 00:24:22.404 "supported_io_types": { 00:24:22.404 "read": true, 00:24:22.404 "write": true, 00:24:22.404 "unmap": true, 00:24:22.404 "flush": true, 00:24:22.404 "reset": true, 00:24:22.404 "nvme_admin": true, 00:24:22.404 "nvme_io": true, 00:24:22.404 "nvme_io_md": false, 00:24:22.404 "write_zeroes": true, 00:24:22.404 "zcopy": false, 00:24:22.404 "get_zone_info": false, 00:24:22.404 "zone_management": false, 00:24:22.404 "zone_append": false, 00:24:22.404 "compare": 
false, 00:24:22.404 "compare_and_write": false, 00:24:22.404 "abort": true, 00:24:22.404 "seek_hole": false, 00:24:22.404 "seek_data": false, 00:24:22.404 "copy": false, 00:24:22.404 "nvme_iov_md": false 00:24:22.404 }, 00:24:22.404 "driver_specific": { 00:24:22.404 "nvme": [ 00:24:22.404 { 00:24:22.404 "pci_address": "0000:5e:00.0", 00:24:22.404 "trid": { 00:24:22.404 "trtype": "PCIe", 00:24:22.404 "traddr": "0000:5e:00.0" 00:24:22.404 }, 00:24:22.404 "ctrlr_data": { 00:24:22.404 "cntlid": 0, 00:24:22.404 "vendor_id": "0x8086", 00:24:22.404 "model_number": "INTEL SSDPE2KX010T8", 00:24:22.404 "serial_number": "BTLJ807001JM1P0FGN", 00:24:22.404 "firmware_revision": "VDV10170", 00:24:22.404 "oacs": { 00:24:22.404 "security": 1, 00:24:22.404 "format": 1, 00:24:22.404 "firmware": 1, 00:24:22.404 "ns_manage": 1 00:24:22.404 }, 00:24:22.404 "multi_ctrlr": false, 00:24:22.404 "ana_reporting": false 00:24:22.404 }, 00:24:22.404 "vs": { 00:24:22.404 "nvme_version": "1.2" 00:24:22.404 }, 00:24:22.404 "ns_data": { 00:24:22.404 "id": 1, 00:24:22.404 "can_share": false 00:24:22.404 }, 00:24:22.404 "security": { 00:24:22.404 "opal": true 00:24:22.404 } 00:24:22.404 } 00:24:22.404 ], 00:24:22.404 "mp_policy": "active_passive" 00:24:22.404 } 00:24:22.404 } 00:24:22.404 ] 00:24:22.404 19:01:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:22.404 19:01:07 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:23.340 1c53978c-756c-4433-96bf-dabf235a1bdd 00:24:23.340 19:01:08 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:23.597 3feb376b-87cc-47c2-af06-261e593e4806 00:24:23.597 19:01:08 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:23.597 19:01:08 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:23.856 [ 00:24:23.856 { 00:24:23.856 "name": "3feb376b-87cc-47c2-af06-261e593e4806", 00:24:23.856 "aliases": [ 00:24:23.856 "lvs0/lv0" 00:24:23.856 ], 00:24:23.856 "product_name": "Logical Volume", 00:24:23.856 "block_size": 512, 00:24:23.856 "num_blocks": 204800, 00:24:23.856 "uuid": "3feb376b-87cc-47c2-af06-261e593e4806", 00:24:23.856 "assigned_rate_limits": { 00:24:23.856 "rw_ios_per_sec": 0, 00:24:23.856 "rw_mbytes_per_sec": 0, 00:24:23.856 "r_mbytes_per_sec": 0, 00:24:23.856 "w_mbytes_per_sec": 0 00:24:23.856 }, 00:24:23.856 "claimed": false, 00:24:23.856 "zoned": false, 00:24:23.856 "supported_io_types": { 00:24:23.856 "read": true, 00:24:23.856 "write": true, 00:24:23.856 "unmap": true, 00:24:23.856 "flush": false, 00:24:23.856 "reset": true, 00:24:23.856 "nvme_admin": false, 00:24:23.856 "nvme_io": false, 00:24:23.856 "nvme_io_md": false, 
00:24:23.856 "write_zeroes": true, 00:24:23.856 "zcopy": false, 00:24:23.856 "get_zone_info": false, 00:24:23.856 "zone_management": false, 00:24:23.856 "zone_append": false, 00:24:23.856 "compare": false, 00:24:23.856 "compare_and_write": false, 00:24:23.856 "abort": false, 00:24:23.856 "seek_hole": true, 00:24:23.856 "seek_data": true, 00:24:23.856 "copy": false, 00:24:23.856 "nvme_iov_md": false 00:24:23.856 }, 00:24:23.856 "driver_specific": { 00:24:23.856 "lvol": { 00:24:23.856 "lvol_store_uuid": "1c53978c-756c-4433-96bf-dabf235a1bdd", 00:24:23.856 "base_bdev": "Nvme0n1", 00:24:23.856 "thin_provision": true, 00:24:23.856 "num_allocated_clusters": 0, 00:24:23.856 "snapshot": false, 00:24:23.856 "clone": false, 00:24:23.856 "esnap_clone": false 00:24:23.856 } 00:24:23.856 } 00:24:23.856 } 00:24:23.856 ] 00:24:23.856 19:01:08 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:23.856 19:01:08 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:23.856 19:01:08 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:24.115 [2024-07-24 19:01:08.872506] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:24.115 COMP_lvs0/lv0 00:24:24.115 19:01:08 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:24.115 19:01:08 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:24.115 19:01:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:24.374 [ 00:24:24.374 { 00:24:24.374 "name": "COMP_lvs0/lv0", 00:24:24.374 "aliases": [ 00:24:24.374 "63b48b1b-1d01-50e2-9ca6-e5a168b74451" 00:24:24.374 ], 00:24:24.374 "product_name": "compress", 00:24:24.374 "block_size": 512, 00:24:24.374 "num_blocks": 200704, 00:24:24.374 "uuid": "63b48b1b-1d01-50e2-9ca6-e5a168b74451", 00:24:24.374 "assigned_rate_limits": { 00:24:24.374 "rw_ios_per_sec": 0, 00:24:24.374 "rw_mbytes_per_sec": 0, 00:24:24.374 "r_mbytes_per_sec": 0, 00:24:24.374 "w_mbytes_per_sec": 0 00:24:24.374 }, 00:24:24.374 "claimed": false, 00:24:24.374 "zoned": false, 00:24:24.374 "supported_io_types": { 00:24:24.374 "read": true, 00:24:24.374 "write": true, 00:24:24.374 "unmap": false, 00:24:24.374 "flush": false, 00:24:24.374 "reset": false, 00:24:24.374 "nvme_admin": false, 00:24:24.374 "nvme_io": false, 00:24:24.374 "nvme_io_md": false, 00:24:24.374 "write_zeroes": true, 00:24:24.374 "zcopy": false, 00:24:24.374 "get_zone_info": false, 00:24:24.374 "zone_management": false, 00:24:24.374 "zone_append": false, 00:24:24.374 "compare": false, 00:24:24.374 "compare_and_write": false, 00:24:24.374 "abort": false, 00:24:24.374 "seek_hole": false, 00:24:24.374 "seek_data": false, 00:24:24.374 "copy": false, 00:24:24.374 "nvme_iov_md": false 00:24:24.374 }, 00:24:24.374 "driver_specific": { 
00:24:24.374 "compress": { 00:24:24.374 "name": "COMP_lvs0/lv0", 00:24:24.374 "base_bdev_name": "3feb376b-87cc-47c2-af06-261e593e4806", 00:24:24.374 "pm_path": "/tmp/pmem/36527629-d387-43f3-88c0-78f8cfb4aaa5" 00:24:24.374 } 00:24:24.374 } 00:24:24.374 } 00:24:24.374 ] 00:24:24.374 19:01:09 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:24.374 19:01:09 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:24.374 Running I/O for 3 seconds... 00:24:27.672 00:24:27.672 Latency(us) 00:24:27.672 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:27.672 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:27.672 Verification LBA range: start 0x0 length 0x3100 00:24:27.672 COMP_lvs0/lv0 : 3.01 3347.29 13.08 0.00 0.00 9513.24 56.32 15416.56 00:24:27.672 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:27.672 Verification LBA range: start 0x3100 length 0x3100 00:24:27.672 COMP_lvs0/lv0 : 3.01 3355.71 13.11 0.00 0.00 9485.62 53.39 15978.30 00:24:27.672 =================================================================================================================== 00:24:27.672 Total : 6703.00 26.18 0.00 0.00 9499.41 53.39 15978.30 00:24:27.672 0 00:24:27.672 19:01:12 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:27.672 19:01:12 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:27.672 19:01:12 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:27.930 19:01:12 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:27.930 19:01:12 compress_isal -- compress/compress.sh@78 -- # killprocess 2218279 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2218279 ']' 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2218279 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2218279 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2218279' 00:24:27.930 killing process with pid 2218279 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@967 -- # kill 2218279 00:24:27.930 Received shutdown signal, test time was about 3.000000 seconds 00:24:27.930 00:24:27.930 Latency(us) 00:24:27.930 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:27.930 =================================================================================================================== 00:24:27.930 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:27.930 19:01:12 compress_isal -- common/autotest_common.sh@972 -- # wait 2218279 00:24:29.305 19:01:14 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:24:29.305 19:01:14 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:29.305 19:01:14 
compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2220121 00:24:29.305 19:01:14 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:29.305 19:01:14 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:29.305 19:01:14 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2220121 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2220121 ']' 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:29.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:29.305 19:01:14 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:29.305 [2024-07-24 19:01:14.296013] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:24:29.305 [2024-07-24 19:01:14.296055] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2220121 ] 00:24:29.564 [2024-07-24 19:01:14.359908] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:29.564 [2024-07-24 19:01:14.439181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:29.564 [2024-07-24 19:01:14.439184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:30.132 19:01:15 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:30.132 19:01:15 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:30.132 19:01:15 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:24:30.132 19:01:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:30.132 19:01:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:33.421 19:01:18 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:33.421 19:01:18 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:33.681 [ 00:24:33.681 { 00:24:33.681 "name": "Nvme0n1", 00:24:33.681 "aliases": [ 00:24:33.681 "8d3c9902-b7c5-41d3-907c-b532546122a2" 00:24:33.681 ], 00:24:33.681 "product_name": "NVMe 
disk", 00:24:33.681 "block_size": 512, 00:24:33.681 "num_blocks": 1953525168, 00:24:33.681 "uuid": "8d3c9902-b7c5-41d3-907c-b532546122a2", 00:24:33.681 "assigned_rate_limits": { 00:24:33.681 "rw_ios_per_sec": 0, 00:24:33.681 "rw_mbytes_per_sec": 0, 00:24:33.681 "r_mbytes_per_sec": 0, 00:24:33.681 "w_mbytes_per_sec": 0 00:24:33.681 }, 00:24:33.681 "claimed": false, 00:24:33.681 "zoned": false, 00:24:33.681 "supported_io_types": { 00:24:33.681 "read": true, 00:24:33.681 "write": true, 00:24:33.681 "unmap": true, 00:24:33.681 "flush": true, 00:24:33.681 "reset": true, 00:24:33.681 "nvme_admin": true, 00:24:33.681 "nvme_io": true, 00:24:33.681 "nvme_io_md": false, 00:24:33.681 "write_zeroes": true, 00:24:33.681 "zcopy": false, 00:24:33.681 "get_zone_info": false, 00:24:33.681 "zone_management": false, 00:24:33.681 "zone_append": false, 00:24:33.681 "compare": false, 00:24:33.681 "compare_and_write": false, 00:24:33.681 "abort": true, 00:24:33.681 "seek_hole": false, 00:24:33.681 "seek_data": false, 00:24:33.681 "copy": false, 00:24:33.681 "nvme_iov_md": false 00:24:33.681 }, 00:24:33.681 "driver_specific": { 00:24:33.681 "nvme": [ 00:24:33.681 { 00:24:33.681 "pci_address": "0000:5e:00.0", 00:24:33.681 "trid": { 00:24:33.681 "trtype": "PCIe", 00:24:33.681 "traddr": "0000:5e:00.0" 00:24:33.681 }, 00:24:33.681 "ctrlr_data": { 00:24:33.681 "cntlid": 0, 00:24:33.681 "vendor_id": "0x8086", 00:24:33.681 "model_number": "INTEL SSDPE2KX010T8", 00:24:33.681 "serial_number": "BTLJ807001JM1P0FGN", 00:24:33.681 "firmware_revision": "VDV10170", 00:24:33.681 "oacs": { 00:24:33.681 "security": 1, 00:24:33.681 "format": 1, 00:24:33.681 "firmware": 1, 00:24:33.681 "ns_manage": 1 00:24:33.681 }, 00:24:33.681 "multi_ctrlr": false, 00:24:33.681 "ana_reporting": false 00:24:33.681 }, 00:24:33.681 "vs": { 00:24:33.681 "nvme_version": "1.2" 00:24:33.681 }, 00:24:33.681 "ns_data": { 00:24:33.681 "id": 1, 00:24:33.681 "can_share": false 00:24:33.681 }, 00:24:33.681 "security": { 00:24:33.681 "opal": true 00:24:33.681 } 00:24:33.681 } 00:24:33.681 ], 00:24:33.681 "mp_policy": "active_passive" 00:24:33.681 } 00:24:33.681 } 00:24:33.681 ] 00:24:33.681 19:01:18 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:33.681 19:01:18 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:34.618 2c55fd01-e412-45fb-8ba8-104729d1a718 00:24:34.618 19:01:19 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:34.877 6d2e7d9d-959c-4b25-9683-8a9732a72d65 00:24:34.877 19:01:19 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:34.877 19:01:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 
2000 00:24:35.136 [ 00:24:35.136 { 00:24:35.136 "name": "6d2e7d9d-959c-4b25-9683-8a9732a72d65", 00:24:35.136 "aliases": [ 00:24:35.136 "lvs0/lv0" 00:24:35.136 ], 00:24:35.136 "product_name": "Logical Volume", 00:24:35.136 "block_size": 512, 00:24:35.136 "num_blocks": 204800, 00:24:35.136 "uuid": "6d2e7d9d-959c-4b25-9683-8a9732a72d65", 00:24:35.136 "assigned_rate_limits": { 00:24:35.136 "rw_ios_per_sec": 0, 00:24:35.136 "rw_mbytes_per_sec": 0, 00:24:35.136 "r_mbytes_per_sec": 0, 00:24:35.136 "w_mbytes_per_sec": 0 00:24:35.136 }, 00:24:35.136 "claimed": false, 00:24:35.136 "zoned": false, 00:24:35.136 "supported_io_types": { 00:24:35.136 "read": true, 00:24:35.136 "write": true, 00:24:35.136 "unmap": true, 00:24:35.136 "flush": false, 00:24:35.136 "reset": true, 00:24:35.136 "nvme_admin": false, 00:24:35.136 "nvme_io": false, 00:24:35.136 "nvme_io_md": false, 00:24:35.136 "write_zeroes": true, 00:24:35.136 "zcopy": false, 00:24:35.136 "get_zone_info": false, 00:24:35.136 "zone_management": false, 00:24:35.136 "zone_append": false, 00:24:35.136 "compare": false, 00:24:35.136 "compare_and_write": false, 00:24:35.136 "abort": false, 00:24:35.136 "seek_hole": true, 00:24:35.136 "seek_data": true, 00:24:35.136 "copy": false, 00:24:35.136 "nvme_iov_md": false 00:24:35.136 }, 00:24:35.136 "driver_specific": { 00:24:35.136 "lvol": { 00:24:35.136 "lvol_store_uuid": "2c55fd01-e412-45fb-8ba8-104729d1a718", 00:24:35.136 "base_bdev": "Nvme0n1", 00:24:35.136 "thin_provision": true, 00:24:35.136 "num_allocated_clusters": 0, 00:24:35.136 "snapshot": false, 00:24:35.136 "clone": false, 00:24:35.136 "esnap_clone": false 00:24:35.136 } 00:24:35.136 } 00:24:35.136 } 00:24:35.136 ] 00:24:35.136 19:01:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:35.136 19:01:19 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:24:35.136 19:01:19 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:24:35.136 [2024-07-24 19:01:20.141442] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:35.136 COMP_lvs0/lv0 00:24:35.395 19:01:20 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:35.395 19:01:20 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:35.654 [ 00:24:35.654 { 00:24:35.654 "name": "COMP_lvs0/lv0", 00:24:35.654 "aliases": [ 00:24:35.654 "4f0de861-2104-5623-bef5-fcdef9524e00" 00:24:35.654 ], 00:24:35.654 "product_name": "compress", 00:24:35.654 "block_size": 512, 00:24:35.654 "num_blocks": 200704, 00:24:35.654 "uuid": "4f0de861-2104-5623-bef5-fcdef9524e00", 00:24:35.654 "assigned_rate_limits": { 00:24:35.654 "rw_ios_per_sec": 0, 00:24:35.654 "rw_mbytes_per_sec": 0, 
00:24:35.654 "r_mbytes_per_sec": 0, 00:24:35.654 "w_mbytes_per_sec": 0 00:24:35.654 }, 00:24:35.654 "claimed": false, 00:24:35.654 "zoned": false, 00:24:35.654 "supported_io_types": { 00:24:35.654 "read": true, 00:24:35.654 "write": true, 00:24:35.654 "unmap": false, 00:24:35.654 "flush": false, 00:24:35.654 "reset": false, 00:24:35.654 "nvme_admin": false, 00:24:35.654 "nvme_io": false, 00:24:35.654 "nvme_io_md": false, 00:24:35.654 "write_zeroes": true, 00:24:35.654 "zcopy": false, 00:24:35.654 "get_zone_info": false, 00:24:35.654 "zone_management": false, 00:24:35.654 "zone_append": false, 00:24:35.654 "compare": false, 00:24:35.654 "compare_and_write": false, 00:24:35.654 "abort": false, 00:24:35.654 "seek_hole": false, 00:24:35.654 "seek_data": false, 00:24:35.654 "copy": false, 00:24:35.654 "nvme_iov_md": false 00:24:35.654 }, 00:24:35.654 "driver_specific": { 00:24:35.654 "compress": { 00:24:35.654 "name": "COMP_lvs0/lv0", 00:24:35.654 "base_bdev_name": "6d2e7d9d-959c-4b25-9683-8a9732a72d65", 00:24:35.654 "pm_path": "/tmp/pmem/a246a864-5a92-4450-a358-eb62b2686d89" 00:24:35.654 } 00:24:35.654 } 00:24:35.654 } 00:24:35.654 ] 00:24:35.654 19:01:20 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:35.654 19:01:20 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:35.654 Running I/O for 3 seconds... 00:24:38.944 00:24:38.944 Latency(us) 00:24:38.944 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:38.944 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:38.944 Verification LBA range: start 0x0 length 0x3100 00:24:38.944 COMP_lvs0/lv0 : 3.01 3373.11 13.18 0.00 0.00 9447.32 55.83 14792.41 00:24:38.944 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:38.944 Verification LBA range: start 0x3100 length 0x3100 00:24:38.944 COMP_lvs0/lv0 : 3.01 3391.08 13.25 0.00 0.00 9398.36 54.37 15915.89 00:24:38.944 =================================================================================================================== 00:24:38.944 Total : 6764.19 26.42 0.00 0.00 9422.77 54.37 15915.89 00:24:38.945 0 00:24:38.945 19:01:23 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:38.945 19:01:23 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:38.945 19:01:23 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:39.202 19:01:23 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:39.202 19:01:23 compress_isal -- compress/compress.sh@78 -- # killprocess 2220121 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2220121 ']' 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2220121 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2220121 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:39.202 19:01:23 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:39.202 19:01:23 compress_isal -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2220121' 00:24:39.202 killing process with pid 2220121 00:24:39.202 19:01:24 compress_isal -- common/autotest_common.sh@967 -- # kill 2220121 00:24:39.202 Received shutdown signal, test time was about 3.000000 seconds 00:24:39.202 00:24:39.202 Latency(us) 00:24:39.202 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:39.202 =================================================================================================================== 00:24:39.203 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:39.203 19:01:24 compress_isal -- common/autotest_common.sh@972 -- # wait 2220121 00:24:40.578 19:01:25 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:24:40.578 19:01:25 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:40.578 19:01:25 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2221956 00:24:40.579 19:01:25 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:40.579 19:01:25 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:40.579 19:01:25 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2221956 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2221956 ']' 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:40.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:40.579 19:01:25 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:40.579 [2024-07-24 19:01:25.521724] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
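[editor's note] Each run_bdevperf pass traced above is driven entirely over the RPC socket: bdevperf is started idle with -z, the Nvme0n1 -> lvstore -> lvol -> compress stack is built through rpc.py, and only then is the workload kicked off. A minimal standalone sketch of that flow, assuming it is run from an SPDK checkout with the NVMe device already bound; the pipeline between gen_nvme.sh and rpc.py is inferred from the trace, and the -l value matches this 4096-byte-chunk pass:

    # start bdevperf idle (-z) so it can be configured over /var/tmp/spdk.sock
    ./build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
    bdevperf_pid=$!

    # build the stack: NVMe -> lvstore -> thin lvol -> compress bdev backed by /tmp/pmem
    ./scripts/gen_nvme.sh | ./scripts/rpc.py load_subsystem_config
    ./scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    ./scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100
    ./scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096

    # run the 3-second verify workload against COMP_lvs0/lv0, then tear down
    ./examples/bdev/bdevperf/bdevperf.py perform_tests
    ./scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
    ./scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
    kill "$bdevperf_pid"
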
00:24:40.579 [2024-07-24 19:01:25.521769] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221956 ] 00:24:40.579 [2024-07-24 19:01:25.584995] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:40.838 [2024-07-24 19:01:25.655839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:40.838 [2024-07-24 19:01:25.655842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:41.405 19:01:26 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:41.405 19:01:26 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:41.405 19:01:26 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:24:41.405 19:01:26 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:41.405 19:01:26 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:44.776 19:01:29 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:44.776 [ 00:24:44.776 { 00:24:44.776 "name": "Nvme0n1", 00:24:44.776 "aliases": [ 00:24:44.776 "30620af0-bdfb-480c-b1d9-8bd494ff1b43" 00:24:44.776 ], 00:24:44.776 "product_name": "NVMe disk", 00:24:44.776 "block_size": 512, 00:24:44.776 "num_blocks": 1953525168, 00:24:44.776 "uuid": "30620af0-bdfb-480c-b1d9-8bd494ff1b43", 00:24:44.776 "assigned_rate_limits": { 00:24:44.776 "rw_ios_per_sec": 0, 00:24:44.776 "rw_mbytes_per_sec": 0, 00:24:44.776 "r_mbytes_per_sec": 0, 00:24:44.776 "w_mbytes_per_sec": 0 00:24:44.776 }, 00:24:44.776 "claimed": false, 00:24:44.776 "zoned": false, 00:24:44.776 "supported_io_types": { 00:24:44.776 "read": true, 00:24:44.776 "write": true, 00:24:44.776 "unmap": true, 00:24:44.776 "flush": true, 00:24:44.776 "reset": true, 00:24:44.776 "nvme_admin": true, 00:24:44.776 "nvme_io": true, 00:24:44.776 "nvme_io_md": false, 00:24:44.776 "write_zeroes": true, 00:24:44.776 "zcopy": false, 00:24:44.776 "get_zone_info": false, 00:24:44.776 "zone_management": false, 00:24:44.776 "zone_append": false, 00:24:44.776 "compare": false, 00:24:44.776 "compare_and_write": false, 00:24:44.776 "abort": true, 00:24:44.776 "seek_hole": false, 00:24:44.776 "seek_data": false, 00:24:44.776 "copy": false, 00:24:44.776 "nvme_iov_md": false 00:24:44.776 }, 00:24:44.776 "driver_specific": { 00:24:44.776 "nvme": [ 00:24:44.776 { 00:24:44.776 "pci_address": "0000:5e:00.0", 00:24:44.776 "trid": { 00:24:44.776 "trtype": "PCIe", 00:24:44.776 "traddr": "0000:5e:00.0" 00:24:44.776 }, 00:24:44.776 
"ctrlr_data": { 00:24:44.776 "cntlid": 0, 00:24:44.776 "vendor_id": "0x8086", 00:24:44.776 "model_number": "INTEL SSDPE2KX010T8", 00:24:44.776 "serial_number": "BTLJ807001JM1P0FGN", 00:24:44.776 "firmware_revision": "VDV10170", 00:24:44.776 "oacs": { 00:24:44.776 "security": 1, 00:24:44.776 "format": 1, 00:24:44.776 "firmware": 1, 00:24:44.776 "ns_manage": 1 00:24:44.776 }, 00:24:44.776 "multi_ctrlr": false, 00:24:44.776 "ana_reporting": false 00:24:44.776 }, 00:24:44.776 "vs": { 00:24:44.776 "nvme_version": "1.2" 00:24:44.776 }, 00:24:44.776 "ns_data": { 00:24:44.776 "id": 1, 00:24:44.776 "can_share": false 00:24:44.776 }, 00:24:44.776 "security": { 00:24:44.776 "opal": true 00:24:44.776 } 00:24:44.776 } 00:24:44.776 ], 00:24:44.776 "mp_policy": "active_passive" 00:24:44.776 } 00:24:44.776 } 00:24:44.776 ] 00:24:44.776 19:01:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:44.776 19:01:29 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:45.712 8d6a6088-abde-469e-8759-a06e56ee40e0 00:24:45.712 19:01:30 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:45.971 ddacca52-a282-4f5a-a4f3-b09e78676243 00:24:45.971 19:01:30 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:45.971 19:01:30 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:46.228 19:01:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:46.228 [ 00:24:46.228 { 00:24:46.228 "name": "ddacca52-a282-4f5a-a4f3-b09e78676243", 00:24:46.228 "aliases": [ 00:24:46.228 "lvs0/lv0" 00:24:46.228 ], 00:24:46.228 "product_name": "Logical Volume", 00:24:46.228 "block_size": 512, 00:24:46.228 "num_blocks": 204800, 00:24:46.228 "uuid": "ddacca52-a282-4f5a-a4f3-b09e78676243", 00:24:46.228 "assigned_rate_limits": { 00:24:46.228 "rw_ios_per_sec": 0, 00:24:46.228 "rw_mbytes_per_sec": 0, 00:24:46.228 "r_mbytes_per_sec": 0, 00:24:46.228 "w_mbytes_per_sec": 0 00:24:46.228 }, 00:24:46.228 "claimed": false, 00:24:46.228 "zoned": false, 00:24:46.228 "supported_io_types": { 00:24:46.228 "read": true, 00:24:46.228 "write": true, 00:24:46.228 "unmap": true, 00:24:46.228 "flush": false, 00:24:46.228 "reset": true, 00:24:46.228 "nvme_admin": false, 00:24:46.228 "nvme_io": false, 00:24:46.228 "nvme_io_md": false, 00:24:46.228 "write_zeroes": true, 00:24:46.228 "zcopy": false, 00:24:46.228 "get_zone_info": false, 00:24:46.228 "zone_management": false, 00:24:46.228 "zone_append": false, 00:24:46.228 "compare": false, 00:24:46.228 "compare_and_write": false, 00:24:46.228 "abort": false, 00:24:46.228 "seek_hole": true, 00:24:46.228 "seek_data": true, 00:24:46.228 "copy": false, 00:24:46.228 "nvme_iov_md": false 00:24:46.228 }, 00:24:46.228 "driver_specific": { 
00:24:46.228 "lvol": { 00:24:46.228 "lvol_store_uuid": "8d6a6088-abde-469e-8759-a06e56ee40e0", 00:24:46.228 "base_bdev": "Nvme0n1", 00:24:46.228 "thin_provision": true, 00:24:46.228 "num_allocated_clusters": 0, 00:24:46.228 "snapshot": false, 00:24:46.228 "clone": false, 00:24:46.228 "esnap_clone": false 00:24:46.228 } 00:24:46.228 } 00:24:46.228 } 00:24:46.228 ] 00:24:46.228 19:01:31 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:46.228 19:01:31 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:24:46.228 19:01:31 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:24:46.486 [2024-07-24 19:01:31.358195] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:46.486 COMP_lvs0/lv0 00:24:46.486 19:01:31 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:46.486 19:01:31 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:47.749 19:01:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:47.749 [ 00:24:47.749 { 00:24:47.749 "name": "COMP_lvs0/lv0", 00:24:47.749 "aliases": [ 00:24:47.749 "d6d15188-c4b4-5668-8319-8420a6a78c33" 00:24:47.749 ], 00:24:47.749 "product_name": "compress", 00:24:47.749 "block_size": 4096, 00:24:47.749 "num_blocks": 25088, 00:24:47.749 "uuid": "d6d15188-c4b4-5668-8319-8420a6a78c33", 00:24:47.749 "assigned_rate_limits": { 00:24:47.749 "rw_ios_per_sec": 0, 00:24:47.749 "rw_mbytes_per_sec": 0, 00:24:47.749 "r_mbytes_per_sec": 0, 00:24:47.749 "w_mbytes_per_sec": 0 00:24:47.749 }, 00:24:47.749 "claimed": false, 00:24:47.749 "zoned": false, 00:24:47.749 "supported_io_types": { 00:24:47.749 "read": true, 00:24:47.749 "write": true, 00:24:47.749 "unmap": false, 00:24:47.749 "flush": false, 00:24:47.749 "reset": false, 00:24:47.749 "nvme_admin": false, 00:24:47.750 "nvme_io": false, 00:24:47.750 "nvme_io_md": false, 00:24:47.750 "write_zeroes": true, 00:24:47.750 "zcopy": false, 00:24:47.750 "get_zone_info": false, 00:24:47.750 "zone_management": false, 00:24:47.750 "zone_append": false, 00:24:47.750 "compare": false, 00:24:47.750 "compare_and_write": false, 00:24:47.750 "abort": false, 00:24:47.750 "seek_hole": false, 00:24:47.750 "seek_data": false, 00:24:47.750 "copy": false, 00:24:47.750 "nvme_iov_md": false 00:24:47.750 }, 00:24:47.750 "driver_specific": { 00:24:47.750 "compress": { 00:24:47.750 "name": "COMP_lvs0/lv0", 00:24:47.750 "base_bdev_name": "ddacca52-a282-4f5a-a4f3-b09e78676243", 00:24:47.750 "pm_path": "/tmp/pmem/3c9ab4b0-ac79-4bf0-9493-30ee70564959" 00:24:47.750 } 00:24:47.750 } 00:24:47.750 } 00:24:47.750 ] 00:24:47.750 19:01:31 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:47.750 19:01:31 compress_isal -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:47.750 Running I/O for 3 seconds... 00:24:50.280 00:24:50.280 Latency(us) 00:24:50.280 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:50.280 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:50.280 Verification LBA range: start 0x0 length 0x3100 00:24:50.280 COMP_lvs0/lv0 : 3.01 3388.61 13.24 0.00 0.00 9391.44 57.78 15416.56 00:24:50.280 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:50.280 Verification LBA range: start 0x3100 length 0x3100 00:24:50.280 COMP_lvs0/lv0 : 3.01 3381.44 13.21 0.00 0.00 9419.70 56.08 15416.56 00:24:50.280 =================================================================================================================== 00:24:50.280 Total : 6770.05 26.45 0.00 0.00 9405.55 56.08 15416.56 00:24:50.280 0 00:24:50.280 19:01:34 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:50.280 19:01:34 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:50.280 19:01:35 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:50.280 19:01:35 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:50.280 19:01:35 compress_isal -- compress/compress.sh@78 -- # killprocess 2221956 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2221956 ']' 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2221956 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2221956 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2221956' 00:24:50.280 killing process with pid 2221956 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@967 -- # kill 2221956 00:24:50.280 Received shutdown signal, test time was about 3.000000 seconds 00:24:50.280 00:24:50.280 Latency(us) 00:24:50.280 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:50.280 =================================================================================================================== 00:24:50.280 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:50.280 19:01:35 compress_isal -- common/autotest_common.sh@972 -- # wait 2221956 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2223796 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:24:51.657 19:01:36 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2223796 
00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2223796 ']' 00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:51.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:51.657 19:01:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:51.915 [2024-07-24 19:01:36.677663] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:24:51.915 [2024-07-24 19:01:36.677711] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2223796 ] 00:24:51.915 [2024-07-24 19:01:36.734455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:51.915 [2024-07-24 19:01:36.808110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:51.915 [2024-07-24 19:01:36.808125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:51.915 [2024-07-24 19:01:36.808126] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:52.482 19:01:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:52.482 19:01:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:52.482 19:01:37 compress_isal -- compress/compress.sh@58 -- # create_vols 00:24:52.482 19:01:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:52.482 19:01:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:55.770 19:01:40 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:55.770 19:01:40 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:56.029 [ 00:24:56.030 { 00:24:56.030 "name": "Nvme0n1", 00:24:56.030 "aliases": [ 00:24:56.030 "712cccd7-e8a1-455d-a12d-b10c6dc946ee" 00:24:56.030 ], 00:24:56.030 "product_name": "NVMe disk", 00:24:56.030 "block_size": 512, 00:24:56.030 "num_blocks": 1953525168, 00:24:56.030 "uuid": "712cccd7-e8a1-455d-a12d-b10c6dc946ee", 00:24:56.030 "assigned_rate_limits": { 00:24:56.030 "rw_ios_per_sec": 0, 00:24:56.030 "rw_mbytes_per_sec": 0, 00:24:56.030 "r_mbytes_per_sec": 0, 00:24:56.030 "w_mbytes_per_sec": 0 00:24:56.030 }, 00:24:56.030 "claimed": false, 00:24:56.030 
"zoned": false, 00:24:56.030 "supported_io_types": { 00:24:56.030 "read": true, 00:24:56.030 "write": true, 00:24:56.030 "unmap": true, 00:24:56.030 "flush": true, 00:24:56.030 "reset": true, 00:24:56.030 "nvme_admin": true, 00:24:56.030 "nvme_io": true, 00:24:56.030 "nvme_io_md": false, 00:24:56.030 "write_zeroes": true, 00:24:56.030 "zcopy": false, 00:24:56.030 "get_zone_info": false, 00:24:56.030 "zone_management": false, 00:24:56.030 "zone_append": false, 00:24:56.030 "compare": false, 00:24:56.030 "compare_and_write": false, 00:24:56.030 "abort": true, 00:24:56.030 "seek_hole": false, 00:24:56.030 "seek_data": false, 00:24:56.030 "copy": false, 00:24:56.030 "nvme_iov_md": false 00:24:56.030 }, 00:24:56.030 "driver_specific": { 00:24:56.030 "nvme": [ 00:24:56.030 { 00:24:56.030 "pci_address": "0000:5e:00.0", 00:24:56.030 "trid": { 00:24:56.030 "trtype": "PCIe", 00:24:56.030 "traddr": "0000:5e:00.0" 00:24:56.030 }, 00:24:56.030 "ctrlr_data": { 00:24:56.030 "cntlid": 0, 00:24:56.030 "vendor_id": "0x8086", 00:24:56.030 "model_number": "INTEL SSDPE2KX010T8", 00:24:56.030 "serial_number": "BTLJ807001JM1P0FGN", 00:24:56.030 "firmware_revision": "VDV10170", 00:24:56.030 "oacs": { 00:24:56.030 "security": 1, 00:24:56.030 "format": 1, 00:24:56.030 "firmware": 1, 00:24:56.030 "ns_manage": 1 00:24:56.030 }, 00:24:56.030 "multi_ctrlr": false, 00:24:56.030 "ana_reporting": false 00:24:56.030 }, 00:24:56.030 "vs": { 00:24:56.030 "nvme_version": "1.2" 00:24:56.030 }, 00:24:56.030 "ns_data": { 00:24:56.030 "id": 1, 00:24:56.030 "can_share": false 00:24:56.030 }, 00:24:56.030 "security": { 00:24:56.030 "opal": true 00:24:56.030 } 00:24:56.030 } 00:24:56.030 ], 00:24:56.030 "mp_policy": "active_passive" 00:24:56.030 } 00:24:56.030 } 00:24:56.030 ] 00:24:56.030 19:01:40 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:56.030 19:01:40 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:56.968 61b4b5a5-0934-407e-8688-00488f686ac7 00:24:56.968 19:01:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:57.225 426a73a5-1b14-4bbb-a70f-da055c521cbb 00:24:57.225 19:01:42 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:57.225 19:01:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:57.225 19:01:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:57.225 19:01:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:57.225 19:01:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:57.226 19:01:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:57.226 19:01:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:57.484 19:01:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:57.484 [ 00:24:57.484 { 00:24:57.484 "name": "426a73a5-1b14-4bbb-a70f-da055c521cbb", 00:24:57.484 "aliases": [ 00:24:57.484 "lvs0/lv0" 00:24:57.484 ], 00:24:57.484 "product_name": "Logical Volume", 00:24:57.484 "block_size": 512, 00:24:57.484 "num_blocks": 204800, 00:24:57.484 "uuid": "426a73a5-1b14-4bbb-a70f-da055c521cbb", 00:24:57.484 "assigned_rate_limits": { 
00:24:57.484 "rw_ios_per_sec": 0, 00:24:57.484 "rw_mbytes_per_sec": 0, 00:24:57.484 "r_mbytes_per_sec": 0, 00:24:57.484 "w_mbytes_per_sec": 0 00:24:57.484 }, 00:24:57.484 "claimed": false, 00:24:57.484 "zoned": false, 00:24:57.484 "supported_io_types": { 00:24:57.484 "read": true, 00:24:57.484 "write": true, 00:24:57.484 "unmap": true, 00:24:57.484 "flush": false, 00:24:57.484 "reset": true, 00:24:57.484 "nvme_admin": false, 00:24:57.484 "nvme_io": false, 00:24:57.484 "nvme_io_md": false, 00:24:57.484 "write_zeroes": true, 00:24:57.484 "zcopy": false, 00:24:57.484 "get_zone_info": false, 00:24:57.484 "zone_management": false, 00:24:57.484 "zone_append": false, 00:24:57.484 "compare": false, 00:24:57.484 "compare_and_write": false, 00:24:57.484 "abort": false, 00:24:57.484 "seek_hole": true, 00:24:57.484 "seek_data": true, 00:24:57.484 "copy": false, 00:24:57.484 "nvme_iov_md": false 00:24:57.484 }, 00:24:57.484 "driver_specific": { 00:24:57.484 "lvol": { 00:24:57.484 "lvol_store_uuid": "61b4b5a5-0934-407e-8688-00488f686ac7", 00:24:57.484 "base_bdev": "Nvme0n1", 00:24:57.484 "thin_provision": true, 00:24:57.484 "num_allocated_clusters": 0, 00:24:57.484 "snapshot": false, 00:24:57.484 "clone": false, 00:24:57.484 "esnap_clone": false 00:24:57.484 } 00:24:57.484 } 00:24:57.484 } 00:24:57.484 ] 00:24:57.484 19:01:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:57.484 19:01:42 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:57.484 19:01:42 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:57.743 [2024-07-24 19:01:42.570462] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:57.743 COMP_lvs0/lv0 00:24:57.743 19:01:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:57.743 19:01:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:58.002 [ 00:24:58.002 { 00:24:58.002 "name": "COMP_lvs0/lv0", 00:24:58.002 "aliases": [ 00:24:58.002 "612d212d-0a74-56d4-b08d-ae1d6701dda4" 00:24:58.002 ], 00:24:58.002 "product_name": "compress", 00:24:58.002 "block_size": 512, 00:24:58.002 "num_blocks": 200704, 00:24:58.002 "uuid": "612d212d-0a74-56d4-b08d-ae1d6701dda4", 00:24:58.002 "assigned_rate_limits": { 00:24:58.002 "rw_ios_per_sec": 0, 00:24:58.002 "rw_mbytes_per_sec": 0, 00:24:58.002 "r_mbytes_per_sec": 0, 00:24:58.002 "w_mbytes_per_sec": 0 00:24:58.002 }, 00:24:58.002 "claimed": false, 00:24:58.002 "zoned": false, 00:24:58.002 "supported_io_types": { 00:24:58.002 "read": true, 00:24:58.002 "write": true, 00:24:58.002 "unmap": false, 00:24:58.002 "flush": false, 00:24:58.002 "reset": false, 00:24:58.002 "nvme_admin": false, 00:24:58.002 "nvme_io": false, 
00:24:58.002 "nvme_io_md": false, 00:24:58.002 "write_zeroes": true, 00:24:58.002 "zcopy": false, 00:24:58.002 "get_zone_info": false, 00:24:58.002 "zone_management": false, 00:24:58.002 "zone_append": false, 00:24:58.002 "compare": false, 00:24:58.002 "compare_and_write": false, 00:24:58.002 "abort": false, 00:24:58.002 "seek_hole": false, 00:24:58.002 "seek_data": false, 00:24:58.002 "copy": false, 00:24:58.002 "nvme_iov_md": false 00:24:58.002 }, 00:24:58.002 "driver_specific": { 00:24:58.002 "compress": { 00:24:58.002 "name": "COMP_lvs0/lv0", 00:24:58.002 "base_bdev_name": "426a73a5-1b14-4bbb-a70f-da055c521cbb", 00:24:58.002 "pm_path": "/tmp/pmem/42eec053-e0ab-40de-9499-604761180fd0" 00:24:58.002 } 00:24:58.002 } 00:24:58.002 } 00:24:58.002 ] 00:24:58.002 19:01:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:58.002 19:01:42 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:58.002 I/O targets: 00:24:58.002 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:58.002 00:24:58.002 00:24:58.002 CUnit - A unit testing framework for C - Version 2.1-3 00:24:58.002 http://cunit.sourceforge.net/ 00:24:58.002 00:24:58.002 00:24:58.002 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:58.002 Test: blockdev write read block ...passed 00:24:58.002 Test: blockdev write zeroes read block ...passed 00:24:58.261 Test: blockdev write zeroes read no split ...passed 00:24:58.261 Test: blockdev write zeroes read split ...passed 00:24:58.261 Test: blockdev write zeroes read split partial ...passed 00:24:58.261 Test: blockdev reset ...[2024-07-24 19:01:43.062132] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:58.261 passed 00:24:58.261 Test: blockdev write read 8 blocks ...passed 00:24:58.261 Test: blockdev write read size > 128k ...passed 00:24:58.261 Test: blockdev write read invalid size ...passed 00:24:58.261 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:58.261 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:58.261 Test: blockdev write read max offset ...passed 00:24:58.261 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:58.261 Test: blockdev writev readv 8 blocks ...passed 00:24:58.261 Test: blockdev writev readv 30 x 1block ...passed 00:24:58.261 Test: blockdev writev readv block ...passed 00:24:58.261 Test: blockdev writev readv size > 128k ...passed 00:24:58.261 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:58.261 Test: blockdev comparev and writev ...passed 00:24:58.261 Test: blockdev nvme passthru rw ...passed 00:24:58.261 Test: blockdev nvme passthru vendor specific ...passed 00:24:58.261 Test: blockdev nvme admin passthru ...passed 00:24:58.261 Test: blockdev copy ...passed 00:24:58.261 00:24:58.261 Run Summary: Type Total Ran Passed Failed Inactive 00:24:58.261 suites 1 1 n/a 0 0 00:24:58.261 tests 23 23 23 0 0 00:24:58.261 asserts 130 130 130 0 n/a 00:24:58.261 00:24:58.261 Elapsed time = 0.198 seconds 00:24:58.261 0 00:24:58.261 19:01:43 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:24:58.261 19:01:43 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:58.527 19:01:43 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:58.527 
19:01:43 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:58.527 19:01:43 compress_isal -- compress/compress.sh@62 -- # killprocess 2223796 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2223796 ']' 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2223796 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2223796 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2223796' 00:24:58.527 killing process with pid 2223796 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@967 -- # kill 2223796 00:24:58.527 19:01:43 compress_isal -- common/autotest_common.sh@972 -- # wait 2223796 00:25:00.432 19:01:44 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:25:00.432 19:01:44 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:25:00.432 00:25:00.432 real 0m42.081s 00:25:00.432 user 1m35.066s 00:25:00.432 sys 0m2.832s 00:25:00.432 19:01:44 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:00.432 19:01:44 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:00.432 ************************************ 00:25:00.432 END TEST compress_isal 00:25:00.432 ************************************ 00:25:00.432 19:01:44 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:25:00.432 19:01:44 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:25:00.432 19:01:44 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:00.432 19:01:44 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:00.432 19:01:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:00.432 19:01:44 -- common/autotest_common.sh@10 -- # set +x 00:25:00.432 ************************************ 00:25:00.432 START TEST blockdev_crypto_aesni 00:25:00.432 ************************************ 00:25:00.432 19:01:44 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:00.432 * Looking for test storage... 
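[editor's note] At this point the run moves from the ISA-L compress tests to the AES-NI crypto bdev tests. run_test simply wraps the stock blockdev test script with the crypto_aesni test type, so the same suite can be reproduced outside Jenkins along these lines (placeholder local path; run as root against a build with the crypto modules enabled):

    cd /path/to/spdk                      # local checkout, placeholder path
    sudo ./test/bdev/blockdev.sh crypto_aesni
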
00:25:00.432 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2225236 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2225236 00:25:00.432 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2225236 ']' 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk.sock...' 00:25:00.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:00.433 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:00.433 [2024-07-24 19:01:45.134833] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:25:00.433 [2024-07-24 19:01:45.134876] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225236 ] 00:25:00.433 [2024-07-24 19:01:45.200042] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:00.433 [2024-07-24 19:01:45.270699] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:01.000 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:01.000 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:25:01.000 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:25:01.000 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:25:01.000 19:01:45 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:25:01.000 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:01.000 19:01:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:01.000 [2024-07-24 19:01:45.928640] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:01.000 [2024-07-24 19:01:45.936668] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:01.000 [2024-07-24 19:01:45.944687] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:01.258 [2024-07-24 19:01:46.014067] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:03.789 true 00:25:03.789 true 00:25:03.789 true 00:25:03.789 true 00:25:03.789 Malloc0 00:25:03.789 Malloc1 00:25:03.789 Malloc2 00:25:03.789 Malloc3 00:25:03.789 [2024-07-24 19:01:48.299374] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:03.789 crypto_ram 00:25:03.789 [2024-07-24 19:01:48.307393] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:03.789 crypto_ram2 00:25:03.789 [2024-07-24 19:01:48.315413] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:03.789 crypto_ram3 00:25:03.789 [2024-07-24 19:01:48.323432] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:03.789 crypto_ram4 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:25:03.789 19:01:48 blockdev_crypto_aesni 
-- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e2a2c95a-b679-5e24-a442-48a757e97dfc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e2a2c95a-b679-5e24-a442-48a757e97dfc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4511afbe-24a2-577b-bf76-4223fa21a603"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4511afbe-24a2-577b-bf76-4223fa21a603",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "560d1a95-1ce6-5640-83dc-ba34a26dd30a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "560d1a95-1ce6-5640-83dc-ba34a26dd30a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "124c79bb-096f-5629-adb1-7586da2103a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124c79bb-096f-5629-adb1-7586da2103a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:25:03.789 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 2225236 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2225236 ']' 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2225236 00:25:03.789 
19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2225236 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2225236' 00:25:03.789 killing process with pid 2225236 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2225236 00:25:03.789 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2225236 00:25:04.048 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:04.048 19:01:48 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:04.048 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:04.048 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:04.048 19:01:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:04.048 ************************************ 00:25:04.048 START TEST bdev_hello_world 00:25:04.048 ************************************ 00:25:04.048 19:01:49 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:04.048 [2024-07-24 19:01:49.051523] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
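[editor's note] The cleanup traced above (kill -0, uname, ps --no-headers -o comm=, kill, wait) is the usual autotest teardown of the previous RPC target before hello_world starts. A condensed sketch of that helper, reconstructed from the xtrace rather than copied from autotest_common.sh:

  killprocess() {
      local pid=$1
      kill -0 "$pid" 2>/dev/null || return 0            # already gone, nothing to do
      [ "$(uname)" = Linux ] && ps --no-headers -o comm= "$pid"   # e.g. reactor_0
      kill "$pid"                                        # ask the SPDK app to exit
      wait "$pid"                                        # reap it before the next test starts
  }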
00:25:04.048 [2024-07-24 19:01:49.051562] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225863 ] 00:25:04.306 [2024-07-24 19:01:49.115696] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:04.306 [2024-07-24 19:01:49.193598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:04.306 [2024-07-24 19:01:49.214525] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:04.306 [2024-07-24 19:01:49.222548] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:04.306 [2024-07-24 19:01:49.230565] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:04.565 [2024-07-24 19:01:49.328598] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:07.097 [2024-07-24 19:01:51.479844] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:07.097 [2024-07-24 19:01:51.479902] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:07.097 [2024-07-24 19:01:51.479912] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:07.097 [2024-07-24 19:01:51.487863] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:07.097 [2024-07-24 19:01:51.487874] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:07.097 [2024-07-24 19:01:51.487879] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:07.097 [2024-07-24 19:01:51.495880] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:07.097 [2024-07-24 19:01:51.495889] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:07.097 [2024-07-24 19:01:51.495894] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:07.097 [2024-07-24 19:01:51.503911] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:07.097 [2024-07-24 19:01:51.503920] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:07.097 [2024-07-24 19:01:51.503925] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:07.097 [2024-07-24 19:01:51.571127] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:25:07.097 [2024-07-24 19:01:51.571162] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:25:07.097 [2024-07-24 19:01:51.571172] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:25:07.097 [2024-07-24 19:01:51.572040] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:25:07.097 [2024-07-24 19:01:51.572094] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:25:07.097 [2024-07-24 19:01:51.572104] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:25:07.097 [2024-07-24 19:01:51.572132] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
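[editor's note] The NOTICE lines above show the four crypto vbdevs from bdev.json coming up: each finds its DEK (test_dek_aesni_cbc_N), waits for its Malloc base bdev to arrive, and is then opened by hello_bdev, which writes and reads back "Hello World!". The driver_specific block dumped earlier in this test can be queried the same way from a live target; the jq filter is illustrative and not part of the test scripts:

  ./scripts/rpc.py bdev_get_bdevs -b crypto_ram2 | jq '.[0].driver_specific.crypto'
  # expected shape, matching the dump above:
  # { "base_bdev_name": "Malloc1", "name": "crypto_ram2", "key_name": "test_dek_aesni_cbc_2" }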
00:25:07.097 00:25:07.097 [2024-07-24 19:01:51.572143] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:25:07.097 00:25:07.097 real 0m2.873s 00:25:07.097 user 0m2.580s 00:25:07.097 sys 0m0.253s 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:25:07.097 ************************************ 00:25:07.097 END TEST bdev_hello_world 00:25:07.097 ************************************ 00:25:07.097 19:01:51 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:25:07.097 19:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:07.097 19:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:07.097 19:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:07.097 ************************************ 00:25:07.097 START TEST bdev_bounds 00:25:07.097 ************************************ 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=2226407 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 2226407' 00:25:07.097 Process bdevio pid: 2226407 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 2226407 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2226407 ']' 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:07.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:07.097 19:01:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:07.097 [2024-07-24 19:01:51.995500] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
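[editor's note] bdev_bounds drives the bdevio app in two steps: start it with -w so it waits on its RPC socket, then trigger the suites with tests.py perform_tests. A sketch using the same binaries and flags as the trace, with the workspace prefix shortened to the repo root:

  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
  bdevio_pid=$!
  # ... waitforlisten: poll until /var/tmp/spdk.sock accepts RPCs ...
  ./test/bdev/bdevio/tests.py perform_tests
  kill "$bdevio_pid" && wait "$bdevio_pid"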
00:25:07.097 [2024-07-24 19:01:51.995539] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226407 ] 00:25:07.097 [2024-07-24 19:01:52.058530] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:07.356 [2024-07-24 19:01:52.132583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:07.356 [2024-07-24 19:01:52.132682] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:07.356 [2024-07-24 19:01:52.132684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:07.356 [2024-07-24 19:01:52.153633] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:07.356 [2024-07-24 19:01:52.161660] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:07.356 [2024-07-24 19:01:52.169680] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:07.356 [2024-07-24 19:01:52.263392] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:09.962 [2024-07-24 19:01:54.406642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:09.962 [2024-07-24 19:01:54.406709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:09.962 [2024-07-24 19:01:54.406719] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:09.962 [2024-07-24 19:01:54.414656] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:09.962 [2024-07-24 19:01:54.414669] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:09.962 [2024-07-24 19:01:54.414676] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:09.962 [2024-07-24 19:01:54.422680] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:09.963 [2024-07-24 19:01:54.422691] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:09.963 [2024-07-24 19:01:54.422696] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:09.963 [2024-07-24 19:01:54.430806] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:09.963 [2024-07-24 19:01:54.430821] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:09.963 [2024-07-24 19:01:54.430832] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:09.963 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:09.963 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:25:09.963 19:01:54 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:09.963 I/O targets: 00:25:09.963 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:25:09.963 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:25:09.963 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:25:09.963 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:25:09.963 00:25:09.963 00:25:09.963 CUnit - A unit testing framework for C - Version 2.1-3 00:25:09.963 http://cunit.sourceforge.net/ 00:25:09.963 00:25:09.963 00:25:09.963 Suite: bdevio tests on: crypto_ram4 00:25:09.963 Test: blockdev write read block ...passed 00:25:09.963 Test: blockdev write zeroes read block ...passed 00:25:09.963 Test: blockdev write zeroes read no split ...passed 00:25:09.963 Test: blockdev write zeroes read split ...passed 00:25:09.963 Test: blockdev write zeroes read split partial ...passed 00:25:09.963 Test: blockdev reset ...passed 00:25:09.963 Test: blockdev write read 8 blocks ...passed 00:25:09.963 Test: blockdev write read size > 128k ...passed 00:25:09.963 Test: blockdev write read invalid size ...passed 00:25:09.963 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:09.963 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:09.963 Test: blockdev write read max offset ...passed 00:25:09.963 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:09.963 Test: blockdev writev readv 8 blocks ...passed 00:25:09.963 Test: blockdev writev readv 30 x 1block ...passed 00:25:09.963 Test: blockdev writev readv block ...passed 00:25:09.963 Test: blockdev writev readv size > 128k ...passed 00:25:09.963 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:09.963 Test: blockdev comparev and writev ...passed 00:25:09.963 Test: blockdev nvme passthru rw ...passed 00:25:09.963 Test: blockdev nvme passthru vendor specific ...passed 00:25:09.963 Test: blockdev nvme admin passthru ...passed 00:25:09.963 Test: blockdev copy ...passed 00:25:09.963 Suite: bdevio tests on: crypto_ram3 00:25:09.963 Test: blockdev write read block ...passed 00:25:09.963 Test: blockdev write zeroes read block ...passed 00:25:09.963 Test: blockdev write zeroes read no split ...passed 00:25:09.963 Test: blockdev write zeroes read split ...passed 00:25:09.963 Test: blockdev write zeroes read split partial ...passed 00:25:09.963 Test: blockdev reset ...passed 00:25:09.963 Test: blockdev write read 8 blocks ...passed 00:25:09.963 Test: blockdev write read size > 128k ...passed 00:25:09.963 Test: blockdev write read invalid size ...passed 00:25:09.963 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:09.963 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:09.963 Test: blockdev write read max offset ...passed 00:25:09.963 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:09.963 Test: blockdev writev readv 8 blocks ...passed 00:25:09.963 Test: blockdev writev readv 30 x 1block ...passed 00:25:09.963 Test: blockdev writev readv block ...passed 00:25:09.963 Test: blockdev writev readv size > 128k ...passed 00:25:09.963 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:09.963 Test: blockdev comparev and writev ...passed 00:25:09.963 Test: blockdev nvme passthru rw ...passed 00:25:09.963 Test: blockdev nvme passthru vendor specific ...passed 00:25:09.963 Test: blockdev nvme admin passthru ...passed 00:25:09.963 Test: blockdev copy ...passed 00:25:09.963 Suite: bdevio tests on: crypto_ram2 00:25:09.963 Test: blockdev write read block ...passed 00:25:09.963 Test: blockdev write zeroes read block ...passed 00:25:09.963 Test: blockdev write zeroes read no split ...passed 00:25:09.963 Test: blockdev write zeroes read split ...passed 00:25:09.963 Test: blockdev write zeroes read split partial ...passed 
00:25:09.963 Test: blockdev reset ...passed 00:25:09.963 Test: blockdev write read 8 blocks ...passed 00:25:09.963 Test: blockdev write read size > 128k ...passed 00:25:09.963 Test: blockdev write read invalid size ...passed 00:25:09.963 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:09.963 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:09.963 Test: blockdev write read max offset ...passed 00:25:09.963 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:09.963 Test: blockdev writev readv 8 blocks ...passed 00:25:09.963 Test: blockdev writev readv 30 x 1block ...passed 00:25:09.963 Test: blockdev writev readv block ...passed 00:25:09.963 Test: blockdev writev readv size > 128k ...passed 00:25:09.963 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:09.963 Test: blockdev comparev and writev ...passed 00:25:09.963 Test: blockdev nvme passthru rw ...passed 00:25:09.963 Test: blockdev nvme passthru vendor specific ...passed 00:25:09.963 Test: blockdev nvme admin passthru ...passed 00:25:09.963 Test: blockdev copy ...passed 00:25:09.963 Suite: bdevio tests on: crypto_ram 00:25:09.963 Test: blockdev write read block ...passed 00:25:09.963 Test: blockdev write zeroes read block ...passed 00:25:09.963 Test: blockdev write zeroes read no split ...passed 00:25:09.963 Test: blockdev write zeroes read split ...passed 00:25:09.963 Test: blockdev write zeroes read split partial ...passed 00:25:09.963 Test: blockdev reset ...passed 00:25:09.963 Test: blockdev write read 8 blocks ...passed 00:25:09.963 Test: blockdev write read size > 128k ...passed 00:25:09.963 Test: blockdev write read invalid size ...passed 00:25:09.963 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:09.964 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:09.964 Test: blockdev write read max offset ...passed 00:25:09.964 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:09.964 Test: blockdev writev readv 8 blocks ...passed 00:25:09.964 Test: blockdev writev readv 30 x 1block ...passed 00:25:09.964 Test: blockdev writev readv block ...passed 00:25:09.964 Test: blockdev writev readv size > 128k ...passed 00:25:09.964 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:09.964 Test: blockdev comparev and writev ...passed 00:25:09.964 Test: blockdev nvme passthru rw ...passed 00:25:09.964 Test: blockdev nvme passthru vendor specific ...passed 00:25:09.964 Test: blockdev nvme admin passthru ...passed 00:25:09.964 Test: blockdev copy ...passed 00:25:09.964 00:25:09.964 Run Summary: Type Total Ran Passed Failed Inactive 00:25:09.964 suites 4 4 n/a 0 0 00:25:09.964 tests 92 92 92 0 0 00:25:09.964 asserts 520 520 520 0 n/a 00:25:09.964 00:25:09.964 Elapsed time = 0.526 seconds 00:25:09.964 0 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 2226407 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2226407 ']' 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2226407 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 2226407 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2226407' 00:25:09.964 killing process with pid 2226407 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2226407 00:25:09.964 19:01:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2226407 00:25:10.222 19:01:55 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:25:10.222 00:25:10.222 real 0m3.281s 00:25:10.222 user 0m9.268s 00:25:10.222 sys 0m0.389s 00:25:10.222 19:01:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:10.222 19:01:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:10.222 ************************************ 00:25:10.222 END TEST bdev_bounds 00:25:10.222 ************************************ 00:25:10.481 19:01:55 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:10.481 19:01:55 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:10.481 19:01:55 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:10.481 19:01:55 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:10.481 ************************************ 00:25:10.481 START TEST bdev_nbd 00:25:10.481 ************************************ 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=2226912 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 2226912 /var/tmp/spdk-nbd.sock 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2226912 ']' 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:25:10.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:10.481 19:01:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:10.481 [2024-07-24 19:01:55.355408] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
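[editor's note] For bdev_nbd the bdevs are hosted in bdev_svc and exported to the kernel as /dev/nbd* nodes over a dedicated RPC socket, so every rpc.py call in this test passes -s /var/tmp/spdk-nbd.sock. Commands copied from the trace, with the workspace prefix dropped:

  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json ./test/bdev/bdev.json &
  nbd_pid=$!
  # once the socket is listening, map each crypto bdev to an nbd node:
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0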
00:25:10.481 [2024-07-24 19:01:55.355445] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:10.481 [2024-07-24 19:01:55.417096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:10.740 [2024-07-24 19:01:55.494820] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:10.740 [2024-07-24 19:01:55.515717] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:10.740 [2024-07-24 19:01:55.523747] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:10.740 [2024-07-24 19:01:55.531756] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:10.740 [2024-07-24 19:01:55.626988] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:13.275 [2024-07-24 19:01:57.773212] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:13.275 [2024-07-24 19:01:57.773265] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:13.276 [2024-07-24 19:01:57.773273] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:13.276 [2024-07-24 19:01:57.781231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:13.276 [2024-07-24 19:01:57.781242] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:13.276 [2024-07-24 19:01:57.781247] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:13.276 [2024-07-24 19:01:57.789251] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:13.276 [2024-07-24 19:01:57.789260] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:13.276 [2024-07-24 19:01:57.789265] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:13.276 [2024-07-24 19:01:57.797271] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:13.276 [2024-07-24 19:01:57.797280] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:13.276 [2024-07-24 19:01:57.797285] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:13.276 19:01:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:13.276 1+0 records in 00:25:13.276 1+0 records out 00:25:13.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021957 s, 18.7 MB/s 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:25:13.276 
19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:13.276 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:13.536 1+0 records in 00:25:13.536 1+0 records out 00:25:13.536 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225125 s, 18.2 MB/s 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
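[editor's note] waitfornbd, which the trace runs after every nbd_start_disk, polls /proc/partitions for the new node and then proves it is readable with a single direct-I/O block. A condensed sketch; the retry sleep is an assumption, and the real helper lives in autotest_common.sh:

  waitfornbd() {
      local nbd_name=$1 i
      for ((i = 1; i <= 20; i++)); do
          grep -q -w "$nbd_name" /proc/partitions && break
          sleep 0.1
      done
      dd if=/dev/"$nbd_name" of=nbdtest bs=4096 count=1 iflag=direct
      [ "$(stat -c %s nbdtest)" != 0 ]      # the copied block must be non-empty
      rm -f nbdtest
  }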
00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:13.536 1+0 records in 00:25:13.536 1+0 records out 00:25:13.536 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000193132 s, 21.2 MB/s 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:13.536 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:13.796 1+0 records in 00:25:13.796 1+0 records out 00:25:13.796 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244239 s, 16.8 MB/s 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:13.796 19:01:58 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:13.796 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd0", 00:25:14.055 "bdev_name": "crypto_ram" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd1", 00:25:14.055 "bdev_name": "crypto_ram2" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd2", 00:25:14.055 "bdev_name": "crypto_ram3" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd3", 00:25:14.055 "bdev_name": "crypto_ram4" 00:25:14.055 } 00:25:14.055 ]' 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd0", 00:25:14.055 "bdev_name": "crypto_ram" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd1", 00:25:14.055 "bdev_name": "crypto_ram2" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd2", 00:25:14.055 "bdev_name": "crypto_ram3" 00:25:14.055 }, 00:25:14.055 { 00:25:14.055 "nbd_device": "/dev/nbd3", 00:25:14.055 "bdev_name": "crypto_ram4" 00:25:14.055 } 00:25:14.055 ]' 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.055 19:01:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
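[editor's note] Teardown mirrors the setup: nbd_get_disks reports the active exports as the JSON shown above, nbd_stop_disk removes each one, and waitfornbd_exit watches /proc/partitions until the kernel node is gone. A condensed loop over the same RPCs (the sleep between polls is an assumption):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  for dev in $($rpc nbd_get_disks | jq -r '.[] | .nbd_device'); do
      $rpc nbd_stop_disk "$dev"
      while grep -q -w "$(basename "$dev")" /proc/partitions; do sleep 0.1; done
  done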
00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.314 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.572 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
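[editor's note] The nbd_get_count helper entered just above simply counts how many /dev/nbd* exports the target still reports; the test expects zero once every device has been stopped. Condensed from the trace (grep -c exits non-zero on an empty list, hence the || true):

  count=$(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks \
          | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
  [ "$count" -eq 0 ]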
00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:14.831 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:15.091 19:01:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:25:15.091 /dev/nbd0 00:25:15.091 19:02:00 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.091 1+0 records in 00:25:15.091 1+0 records out 00:25:15.091 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022257 s, 18.4 MB/s 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:15.091 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:25:15.351 /dev/nbd1 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.351 1+0 records in 00:25:15.351 1+0 records out 00:25:15.351 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259886 s, 15.8 MB/s 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:15.351 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:25:15.608 /dev/nbd10 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.608 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.609 1+0 records in 00:25:15.609 1+0 records out 00:25:15.609 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225203 s, 18.2 MB/s 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:25:15.609 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:25:15.867 /dev/nbd11 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:15.867 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.868 1+0 records in 00:25:15.868 1+0 records out 00:25:15.868 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251794 s, 16.3 MB/s 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd0", 00:25:15.868 "bdev_name": "crypto_ram" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd1", 00:25:15.868 "bdev_name": "crypto_ram2" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd10", 00:25:15.868 "bdev_name": "crypto_ram3" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd11", 00:25:15.868 "bdev_name": "crypto_ram4" 00:25:15.868 } 00:25:15.868 ]' 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd0", 00:25:15.868 "bdev_name": "crypto_ram" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd1", 00:25:15.868 "bdev_name": "crypto_ram2" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd10", 00:25:15.868 "bdev_name": "crypto_ram3" 00:25:15.868 }, 00:25:15.868 { 00:25:15.868 "nbd_device": "/dev/nbd11", 00:25:15.868 "bdev_name": "crypto_ram4" 00:25:15.868 } 00:25:15.868 ]' 00:25:15.868 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:25:16.127 /dev/nbd1 00:25:16.127 /dev/nbd10 00:25:16.127 /dev/nbd11' 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:25:16.127 /dev/nbd1 00:25:16.127 /dev/nbd10 00:25:16.127 /dev/nbd11' 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:25:16.127 256+0 records in 00:25:16.127 256+0 records out 00:25:16.127 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103356 s, 101 MB/s 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:25:16.127 256+0 records in 00:25:16.127 256+0 records out 00:25:16.127 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0269513 s, 38.9 MB/s 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:25:16.127 256+0 records in 00:25:16.127 256+0 records out 00:25:16.127 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0299503 s, 35.0 MB/s 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:16.127 19:02:00 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:25:16.127 256+0 records in 00:25:16.127 256+0 records out 00:25:16.127 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0262203 s, 40.0 MB/s 00:25:16.127 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:16.127 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:25:16.128 256+0 records in 00:25:16.128 256+0 records out 00:25:16.128 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0245516 s, 42.7 MB/s 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:16.128 
19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.128 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.386 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.646 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:16.904 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:17.163 19:02:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:25:17.163 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:25:17.163 malloc_lvol_verify 00:25:17.422 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:25:17.422 cfbbd2de-01a7-4920-a531-719819755483 00:25:17.422 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:25:17.680 162ee78e-f86a-4407-8b82-433547de8052 00:25:17.680 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:25:17.680 /dev/nbd0 00:25:17.680 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:25:17.938 mke2fs 1.46.5 (30-Dec-2021) 00:25:17.939 Discarding device blocks: 0/4096 done 00:25:17.939 Creating filesystem with 4096 1k blocks and 1024 inodes 00:25:17.939 00:25:17.939 Allocating group tables: 0/1 done 00:25:17.939 Writing inode tables: 0/1 done 00:25:17.939 Creating journal (1024 blocks): done 00:25:17.939 Writing superblocks and filesystem accounting information: 0/1 done 00:25:17.939 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 2226912 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2226912 ']' 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2226912 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2226912 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2226912' 00:25:17.939 killing process with pid 2226912 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2226912 00:25:17.939 19:02:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2226912 00:25:18.507 19:02:03 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:25:18.507 00:25:18.507 real 0m7.925s 00:25:18.507 user 0m10.557s 00:25:18.507 sys 0m2.408s 00:25:18.507 19:02:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:18.507 19:02:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:18.507 ************************************ 00:25:18.507 END TEST bdev_nbd 00:25:18.507 ************************************ 00:25:18.507 19:02:03 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:25:18.508 19:02:03 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:25:18.508 19:02:03 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:18.508 19:02:03 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.508 ************************************ 00:25:18.508 START TEST bdev_fio 00:25:18.508 ************************************ 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:18.508 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=verify 00:25:18.508 19:02:03 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type=AIO 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z verify ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1311 -- # '[' verify == verify ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # cat 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1321 -- # '[' AIO == AIO ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # /usr/src/fio/fio --version 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # echo serialize_overlap=1 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:18.508 ************************************ 00:25:18.508 START TEST bdev_fio_rw_verify 00:25:18.508 ************************************ 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local sanitizers 00:25:18.508 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # shift 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # local asan_lib= 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libasan 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:18.509 19:02:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:18.768 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:18.768 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:18.768 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:18.768 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:18.768 fio-3.35 00:25:18.768 Starting 4 threads 00:25:33.652 00:25:33.652 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2229032: Wed Jul 24 19:02:16 2024 00:25:33.652 read: IOPS=24.8k, BW=97.0MiB/s (102MB/s)(970MiB/10001msec) 00:25:33.652 slat (usec): min=11, max=1135, avg=55.51, stdev=27.23 00:25:33.652 clat (usec): min=7, max=1926, avg=298.05, stdev=173.65 00:25:33.652 lat (usec): min=19, max=2161, avg=353.56, stdev=187.40 00:25:33.652 clat percentiles (usec): 00:25:33.652 | 50.000th=[ 265], 99.000th=[ 791], 99.900th=[ 955], 99.990th=[ 1237], 00:25:33.652 | 99.999th=[ 1811] 00:25:33.652 write: IOPS=27.3k, BW=107MiB/s (112MB/s)(1040MiB/9741msec); 0 zone resets 00:25:33.652 slat (usec): min=17, max=333, avg=63.84, stdev=25.52 00:25:33.652 clat (usec): min=23, max=2630, avg=351.28, stdev=202.46 00:25:33.652 lat (usec): min=65, max=2963, avg=415.12, stdev=214.22 00:25:33.652 clat percentiles (usec): 00:25:33.652 | 50.000th=[ 318], 99.000th=[ 979], 99.900th=[ 1172], 99.990th=[ 1549], 00:25:33.652 | 99.999th=[ 2376] 00:25:33.652 bw ( KiB/s): min=86768, max=148871, per=97.53%, avg=106615.11, stdev=3508.01, samples=76 00:25:33.652 iops : min=21692, max=37217, avg=26653.74, stdev=876.96, samples=76 00:25:33.652 lat (usec) : 10=0.01%, 20=0.01%, 50=0.87%, 100=7.28%, 250=32.88% 00:25:33.652 lat (usec) : 500=42.18%, 750=13.51%, 1000=2.82% 00:25:33.652 lat (msec) : 2=0.46%, 4=0.01% 00:25:33.652 cpu : usr=99.67%, sys=0.01%, ctx=61, majf=0, minf=229 00:25:33.652 IO depths : 1=10.0%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:33.652 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:33.652 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:33.652 issued rwts: total=248353,266211,0,0 short=0,0,0,0 dropped=0,0,0,0 
00:25:33.652 latency : target=0, window=0, percentile=100.00%, depth=8 00:25:33.652 00:25:33.652 Run status group 0 (all jobs): 00:25:33.652 READ: bw=97.0MiB/s (102MB/s), 97.0MiB/s-97.0MiB/s (102MB/s-102MB/s), io=970MiB (1017MB), run=10001-10001msec 00:25:33.652 WRITE: bw=107MiB/s (112MB/s), 107MiB/s-107MiB/s (112MB/s-112MB/s), io=1040MiB (1090MB), run=9741-9741msec 00:25:33.652 00:25:33.652 real 0m13.232s 00:25:33.652 user 0m48.269s 00:25:33.652 sys 0m0.355s 00:25:33.652 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:33.652 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:25:33.652 ************************************ 00:25:33.652 END TEST bdev_fio_rw_verify 00:25:33.653 ************************************ 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=trim 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type= 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z trim ']' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1311 -- # '[' trim == verify ']' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1326 -- # '[' trim == trim ']' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1327 -- # echo rw=trimwrite 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e2a2c95a-b679-5e24-a442-48a757e97dfc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e2a2c95a-b679-5e24-a442-48a757e97dfc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4511afbe-24a2-577b-bf76-4223fa21a603"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4511afbe-24a2-577b-bf76-4223fa21a603",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "560d1a95-1ce6-5640-83dc-ba34a26dd30a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "560d1a95-1ce6-5640-83dc-ba34a26dd30a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "124c79bb-096f-5629-adb1-7586da2103a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124c79bb-096f-5629-adb1-7586da2103a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:25:33.653 crypto_ram2 00:25:33.653 crypto_ram3 00:25:33.653 crypto_ram4 ]] 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e2a2c95a-b679-5e24-a442-48a757e97dfc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e2a2c95a-b679-5e24-a442-48a757e97dfc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4511afbe-24a2-577b-bf76-4223fa21a603"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4511afbe-24a2-577b-bf76-4223fa21a603",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "560d1a95-1ce6-5640-83dc-ba34a26dd30a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "560d1a95-1ce6-5640-83dc-ba34a26dd30a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "124c79bb-096f-5629-adb1-7586da2103a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124c79bb-096f-5629-adb1-7586da2103a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:25:33.653 19:02:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:33.654 ************************************ 00:25:33.654 START TEST bdev_fio_trim 00:25:33.654 ************************************ 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local sanitizers 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # shift 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # local asan_lib= 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libasan 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:25:33.654 19:02:16 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:33.654 19:02:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.654 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.654 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.654 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.654 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.654 fio-3.35 00:25:33.654 Starting 4 threads 00:25:45.854 00:25:45.854 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2231218: Wed Jul 24 19:02:29 2024 00:25:45.854 write: IOPS=43.2k, BW=169MiB/s (177MB/s)(1689MiB/10001msec); 0 zone resets 00:25:45.854 slat (usec): min=12, max=321, avg=52.57, stdev=29.39 00:25:45.854 clat (usec): min=37, max=1446, avg=231.96, stdev=160.35 00:25:45.854 lat (usec): min=49, max=1492, avg=284.52, stdev=181.01 00:25:45.854 clat percentiles (usec): 00:25:45.854 | 50.000th=[ 188], 99.000th=[ 783], 99.900th=[ 873], 99.990th=[ 914], 00:25:45.854 | 99.999th=[ 1123] 00:25:45.854 bw ( KiB/s): min=149944, max=271504, per=100.00%, avg=173701.11, stdev=8411.50, samples=76 00:25:45.854 iops : min=37486, max=67876, avg=43425.16, stdev=2102.88, samples=76 00:25:45.854 trim: IOPS=43.2k, BW=169MiB/s (177MB/s)(1689MiB/10001msec); 0 zone resets 00:25:45.854 slat (usec): min=4, max=1007, avg=16.22, stdev= 7.22 00:25:45.854 clat (usec): min=35, max=1340, avg=218.47, stdev=105.72 00:25:45.854 lat (usec): min=54, max=1352, avg=234.69, stdev=108.43 00:25:45.854 clat percentiles (usec): 00:25:45.854 | 50.000th=[ 200], 99.000th=[ 537], 99.900th=[ 594], 99.990th=[ 635], 00:25:45.854 | 99.999th=[ 840] 00:25:45.854 bw ( KiB/s): min=149936, max=271504, per=100.00%, avg=173701.53, stdev=8412.38, samples=76 00:25:45.854 iops : min=37484, max=67876, avg=43425.47, stdev=2103.08, samples=76 00:25:45.854 lat (usec) : 50=0.52%, 100=12.29%, 250=54.94%, 500=27.11%, 750=4.37% 00:25:45.854 lat (usec) : 1000=0.77% 00:25:45.854 lat (msec) : 2=0.01% 00:25:45.854 cpu : usr=99.69%, sys=0.00%, ctx=51, majf=0, minf=92 00:25:45.854 IO depths : 1=8.1%, 2=26.3%, 4=52.5%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:45.854 
submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:45.854 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:45.854 issued rwts: total=0,432348,432349,0 short=0,0,0,0 dropped=0,0,0,0 00:25:45.854 latency : target=0, window=0, percentile=100.00%, depth=8 00:25:45.854 00:25:45.854 Run status group 0 (all jobs): 00:25:45.854 WRITE: bw=169MiB/s (177MB/s), 169MiB/s-169MiB/s (177MB/s-177MB/s), io=1689MiB (1771MB), run=10001-10001msec 00:25:45.854 TRIM: bw=169MiB/s (177MB/s), 169MiB/s-169MiB/s (177MB/s-177MB/s), io=1689MiB (1771MB), run=10001-10001msec 00:25:45.854 00:25:45.854 real 0m13.260s 00:25:45.855 user 0m48.620s 00:25:45.855 sys 0m0.361s 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:25:45.855 ************************************ 00:25:45.855 END TEST bdev_fio_trim 00:25:45.855 ************************************ 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:25:45.855 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:25:45.855 00:25:45.855 real 0m26.774s 00:25:45.855 user 1m37.046s 00:25:45.855 sys 0m0.854s 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:45.855 ************************************ 00:25:45.855 END TEST bdev_fio 00:25:45.855 ************************************ 00:25:45.855 19:02:30 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:45.855 19:02:30 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:25:45.855 19:02:30 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:25:45.855 19:02:30 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:45.855 19:02:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:45.855 ************************************ 00:25:45.855 START TEST bdev_verify 00:25:45.855 ************************************ 00:25:45.855 19:02:30 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:25:45.855 [2024-07-24 19:02:30.173565] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:25:45.855 [2024-07-24 19:02:30.173599] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2233043 ] 00:25:45.855 [2024-07-24 19:02:30.234854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:45.855 [2024-07-24 19:02:30.307219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:45.855 [2024-07-24 19:02:30.307221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:45.855 [2024-07-24 19:02:30.328226] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:45.855 [2024-07-24 19:02:30.336248] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:45.855 [2024-07-24 19:02:30.344270] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:45.855 [2024-07-24 19:02:30.436950] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:47.756 [2024-07-24 19:02:32.585039] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:47.756 [2024-07-24 19:02:32.585091] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:47.756 [2024-07-24 19:02:32.585102] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:47.756 [2024-07-24 19:02:32.593058] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:47.756 [2024-07-24 19:02:32.593082] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:47.756 [2024-07-24 19:02:32.593092] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:47.756 [2024-07-24 19:02:32.601078] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:47.756 [2024-07-24 19:02:32.601092] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:47.756 [2024-07-24 19:02:32.601100] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:47.756 [2024-07-24 19:02:32.609098] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:47.756 [2024-07-24 19:02:32.609111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:47.756 [2024-07-24 19:02:32.609119] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:47.756 Running I/O for 5 seconds... 
00:25:53.091
00:25:53.091 Latency(us)
00:25:53.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:25:53.091 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x0 length 0x1000
00:25:53.091 crypto_ram : 5.05 734.27 2.87 0.00 0.00 174008.73 2793.08 113845.39
00:25:53.091 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x1000 length 0x1000
00:25:53.091 crypto_ram : 5.05 735.55 2.87 0.00 0.00 173727.28 3339.22 113845.39
00:25:53.091 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x0 length 0x1000
00:25:53.091 crypto_ram2 : 5.05 734.37 2.87 0.00 0.00 173633.35 2839.89 106355.57
00:25:53.091 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x1000 length 0x1000
00:25:53.091 crypto_ram2 : 5.05 735.45 2.87 0.00 0.00 173370.49 3417.23 105856.24
00:25:53.091 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x0 length 0x1000
00:25:53.091 crypto_ram3 : 5.05 5732.55 22.39 0.00 0.00 22179.07 3448.44 17226.61
00:25:53.091 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x1000 length 0x1000
00:25:53.091 crypto_ram3 : 5.04 5765.80 22.52 0.00 0.00 22052.06 3588.88 17226.61
00:25:53.091 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x0 length 0x1000
00:25:53.091 crypto_ram4 : 5.05 5733.04 22.39 0.00 0.00 22134.54 3635.69 15666.22
00:25:53.091 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:25:53.091 Verification LBA range: start 0x1000 length 0x1000
00:25:53.091 crypto_ram4 : 5.04 5764.99 22.52 0.00 0.00 22011.92 3682.50 15166.90
00:25:53.091 ===================================================================================================================
00:25:53.091 Total : 25936.02 101.31 0.00 0.00 39300.14 2793.08 113845.39
00:25:53.091
00:25:53.091 real 0m7.935s
00:25:53.091 user 0m15.293s
00:25:53.092 sys 0m0.248s
00:25:53.092 19:02:38 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:25:53.092 19:02:38 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:25:53.092 ************************************
00:25:53.092 END TEST bdev_verify
00:25:53.092 ************************************
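The Total row in the latency table above is the sum of the eight per-job rows; a quick arithmetic check of the IOPS column, using the values printed in the table:

  # per-job IOPS from the verify run above
  echo '734.27 + 735.55 + 734.37 + 735.45 + 5732.55 + 5765.80 + 5733.04 + 5764.99' | bc
  # -> 25936.02, matching the Total row; at 4096-byte I/Os that is roughly 101 MiB/s, matching the MiB/s total of 101.31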
00:25:53.092 19:02:38 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:25:53.092 19:02:38 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:25:53.092 19:02:38 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:25:53.092 19:02:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:25:53.350 ************************************
00:25:53.350 START TEST bdev_verify_big_io
00:25:53.350 ************************************
00:25:53.350 19:02:38 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:25:53.350 [2024-07-24 19:02:38.178196] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization...
00:25:53.350 [2024-07-24 19:02:38.178230] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2234205 ]
00:25:53.350 [2024-07-24 19:02:38.240384] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:25:53.350 [2024-07-24 19:02:38.312833] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:25:53.350 [2024-07-24 19:02:38.312837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:25:53.351 [2024-07-24 19:02:38.333826] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:25:53.351 [2024-07-24 19:02:38.341846] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:25:53.351 [2024-07-24 19:02:38.349867] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:25:53.609 [2024-07-24 19:02:38.445336] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:25:56.142 [2024-07-24 19:02:40.589415] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:25:56.142 [2024-07-24 19:02:40.589481] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:25:56.142 [2024-07-24 19:02:40.589493] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:25:56.142 [2024-07-24 19:02:40.597434] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:25:56.142 [2024-07-24 19:02:40.597456] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:25:56.142 [2024-07-24 19:02:40.597466] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:25:56.142 [2024-07-24 19:02:40.605455] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:25:56.142 [2024-07-24 19:02:40.605472] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:25:56.142 [2024-07-24 19:02:40.605480] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:25:56.142 [2024-07-24 19:02:40.613481] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:25:56.142 [2024-07-24 19:02:40.613493] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:25:56.142 [2024-07-24 19:02:40.613501] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:25:56.142 Running I/O for 5 seconds...
00:25:56.404 [2024-07-24 19:02:41.177519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:25:56.404 [2024-07-24 19:02:41.177811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:25:56.404 [2024-07-24 19:02:41.177929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
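The *ERROR* records that begin here and continue below come from accel_dpdk_cryptodev_task_alloc_resources: with 64 KiB I/Os at queue depth 128 on two cores, the dpdk_cryptodev module fails to pull enough source/destination mbufs from its pool for some requests. Reading this as transient pool exhaustion under the larger I/O size, rather than a functional failure, is my interpretation and not something the log states. When triaging a run like this, a quick way to gauge how often it happened (the log file name below is an assumption):

  grep -c 'Failed to get src_mbufs' crypto-phy-autotest.log
  grep -c 'Failed to get dst_mbufs' crypto-phy-autotest.log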
00:25:56.404 [2024-07-24 19:02:41.177972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:56.404 [2024-07-24 19:02:41.178003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:56.404 [2024-07-24 19:02:41.178213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.179877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.180680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.180718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.180744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.180780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.181217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.181252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.181285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.181322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.181584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.404 [2024-07-24 19:02:41.182401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.182910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.183154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.404 [2024-07-24 19:02:41.184618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.184655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.184892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.185786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.185820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.185845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.185870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.186161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.186207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.186233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.186258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.186475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.405 [2024-07-24 19:02:41.187264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.187752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.188006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.188740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.188773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.188797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.188820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.189154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.189183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.189207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.189230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.189464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.405 [2024-07-24 19:02:41.190667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.190958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.191625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.191660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.191686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.191714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.192058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.192088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.192113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.192138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.192405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.193944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.194691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.194726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.194751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.405 [2024-07-24 19:02:41.194794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.195146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.195176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.195202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.195236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.195504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.196969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.197256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.198994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.199232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.405 [2024-07-24 19:02:41.200261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.405 [2024-07-24 19:02:41.200850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.201099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.201937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.201981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.202745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.203996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.406 [2024-07-24 19:02:41.204033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.204060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.204346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.204966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.205727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.206881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.207180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.207893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.207926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.207952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.406 [2024-07-24 19:02:41.207977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.208324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.208355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.208380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.208415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.208766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.209556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.209594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.209619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.209643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.210002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.210036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.210061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.210097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.210409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.211823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.212117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.406 [2024-07-24 19:02:41.213223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.213988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.214768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.214803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.214839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.214864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.215236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.215266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.215292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.215316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.215508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.406 [2024-07-24 19:02:41.216816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.216841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.217185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.406 [2024-07-24 19:02:41.217908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.217943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.217970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.217995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.218342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.218372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.218399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.218435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.218757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.219523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.219558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.219583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.219624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.220050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.220081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.220105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.220129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.220358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.407 [2024-07-24 19:02:41.221246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.221886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.222702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.222739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.222764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.222788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.223103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.223137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.223163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.223189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.223452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.224973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.407 [2024-07-24 19:02:41.225641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.225674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.225709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.225735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.226111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.226140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.226165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.226189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.226410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.227709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.228289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.229102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.229831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.230589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.230888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.231753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.407 [2024-07-24 19:02:41.232131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.233661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.234150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.234895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.235714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.236813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.237079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.237331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.237876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.239556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.240114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.240851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.241620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.242817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.243077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.243327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.244030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.245696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.246465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.247217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.247980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.249005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.249276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.407 [2024-07-24 19:02:41.249535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.408 [2024-07-24 19:02:41.250473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:25:56.408 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously, timestamps 19:02:41.250473 through 19:02:41.486731 ...]
00:25:56.676 [2024-07-24 19:02:41.486731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:25:56.676 [2024-07-24 19:02:41.486760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.487629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.487664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.488740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.489675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.489940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.490858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.491796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.492014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.492985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.493021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.494543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.495432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.495869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.496121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.496410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.496522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.497280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.498154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.676 [2024-07-24 19:02:41.499167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.500906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.501865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.502764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.503016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.503262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.503956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.504709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.505461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.506341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.507870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.508767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.509416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.509681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.510024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.510958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.511754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.512530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.513420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.514990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.515896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.516334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.516590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.516938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.676 [2024-07-24 19:02:41.518190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.519094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.676 [2024-07-24 19:02:41.519887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.520766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.522343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.523250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.523519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.523773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.523982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.524778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.525619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.526553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.527453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.529140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.529977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.530235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.530494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.530671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.531487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.532246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.533133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.533699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.535411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.535919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.677 [2024-07-24 19:02:41.536175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.536431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.536612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.537657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.538536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.539488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.539875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.541605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.541873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.542127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.542577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.542757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.543689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.544596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.545464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.546090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.547743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.548006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.548261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.548972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.549180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.550010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.550912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.551529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.677 [2024-07-24 19:02:41.552381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.553824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.554096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.554350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.555288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.555503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.556325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.557221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.557628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.558531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.559718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.559979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.560287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.561098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.561276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.562252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.563212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.563661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.564408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.565375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.565643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.566182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.566931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.567110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.677 [2024-07-24 19:02:41.568084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.568868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.569567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.570322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.571322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.677 [2024-07-24 19:02:41.571590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.572408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.573142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.573352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.574316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.574845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.575791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.576614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.577645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.577908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.578856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.579770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.579999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.580966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.581318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.582128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.583059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.584179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.584664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.678 [2024-07-24 19:02:41.585471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.586329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.586512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.587566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.588138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.588879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.589640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.590709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.591301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.592046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.592811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.592986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.593841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.594556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.595308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.596062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.597085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.597905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.598640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.599393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.599573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.600213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.601140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.601937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.678 [2024-07-24 19:02:41.602684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.603871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.604877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.605752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.606519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.606696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.607154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.608015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.608977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.609871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.611125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.611885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.612709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.613597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.613774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.614381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.615134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.615904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.616796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.618253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.619007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.619775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.620650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.620847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.678 [2024-07-24 19:02:41.621737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.622489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.623241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.624132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.626339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.627193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.627951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.628829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.629075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.630080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.631060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.631914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.632824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.634611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.635486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.636444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.637335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.637588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.638406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.639169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.640058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.640829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.642432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.643204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.678 [2024-07-24 19:02:41.644090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.678 [2024-07-24 19:02:41.644738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.644917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.645723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.646482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.647365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.647872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.649767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.650535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.651411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.651830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.652006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.652981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.653810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.654688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.654958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.656805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.657740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.658694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.659132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.659311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.660258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.661198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.662095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.679 [2024-07-24 19:02:41.662352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.663913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.664810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.665559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.666262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.666474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.667305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.668195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.668819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.669083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.670793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.671694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.672212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.673139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.673376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.674207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.675104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.675511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.675767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.677165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.679 [2024-07-24 19:02:41.677882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.678630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.679196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.679534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.942 [2024-07-24 19:02:41.679876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.680932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.681801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.682535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.684124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.684730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.684998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.685259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.685519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.685861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.686129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.686391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.686662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.688224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.688508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.688778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.689042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.689324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.689676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.689947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.690215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.690484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.691748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.692017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.942 [2024-07-24 19:02:41.692283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.692315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.692580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.692925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.693192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.693456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.693727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.695583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.695624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.942 [2024-07-24 19:02:41.695650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.695676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.695944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.696291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.696577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.696609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.696884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.697856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.697893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.697919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.697944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.698222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.698577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.698620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.943 [2024-07-24 19:02:41.698647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.943 [2024-07-24 19:02:41.698673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:25:56.943 [... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources "Failed to get src_mbufs!" error repeats for every allocation attempt between 19:02:41.698 and 19:02:41.917; several hundred identical entries omitted ...]
00:25:56.948 [2024-07-24 19:02:41.917827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:25:56.949 [2024-07-24 19:02:41.918732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.920459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.920734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.920992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.921249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.921460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.921996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.922448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.923127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.923395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.925178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.925685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.926312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.927188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.927477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.927809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.928070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.928329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.928597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.929689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.929979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.930245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.930513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.930798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.949 [2024-07-24 19:02:41.931145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.931416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.931689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.931961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.933485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.933761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.934031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.934297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.934539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.934876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.935158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.935428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.935700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.937004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.937280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.937555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.937822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.938095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.938435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.938720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.938988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.939252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.940672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.940952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.949 [2024-07-24 19:02:41.941219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.941251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.941539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.941885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.942157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.942423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.942697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.944819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.945091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.945124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.945386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.946382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.946430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.946476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.946503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.946758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.947102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.947140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:56.949 [2024-07-24 19:02:41.947169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:56.949 [2024-07-24 19:02:41.947196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.948830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.949858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.949894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.949921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.949946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.950271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.950392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.950421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.950472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.950502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.951555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.951606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.951643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.951670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.951956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.212 [2024-07-24 19:02:41.952090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.952123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.952160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.952187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.212 [2024-07-24 19:02:41.953844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.953885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.953922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.953950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.955556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.956543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.956579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.213 [2024-07-24 19:02:41.956606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.956632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.956907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.957023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.957053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.957078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.957104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.958795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.960481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.960517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.960562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.960595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.960875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.961001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.961053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.961083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.213 [2024-07-24 19:02:41.961110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.962761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.963649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.963695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.963722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.963748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.963970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.964082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.964111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.964137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.964161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.213 [2024-07-24 19:02:41.965601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.965698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.966868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.966902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.966927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.966952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.967202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.967319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.967361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.967388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.967413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.968590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.968640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.968666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.968691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.968990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.969103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.969132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.969160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.969184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.213 [2024-07-24 19:02:41.970226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.213 [2024-07-24 19:02:41.970674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.970700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.971999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.972988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.973012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.973245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.973352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.973381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.973406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.214 [2024-07-24 19:02:41.973436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.974810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.975653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.975692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.975725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.975761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.975934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.976043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.976076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.976103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.976127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.214 [2024-07-24 19:02:41.977405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.977489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.978630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.978675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.978700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.978729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.978901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.979990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.980454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.214 [2024-07-24 19:02:41.981359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.981755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.982977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.983002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.983832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.983873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.983899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.983926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.984152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.984265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.984294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.984321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.214 [2024-07-24 19:02:41.984359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.214 [2024-07-24 19:02:41.985348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.985803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.986653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.986686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.986718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.986743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.986915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.987024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.987056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.987081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.987106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.987966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.988001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.988026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.988057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:25:57.215 [2024-07-24 19:02:41.988230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:25:57.215 [2024-07-24 19:02:41.988343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:25:57.215 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-07-24 19:02:42.349097 ...]
00:25:57.480 [2024-07-24 19:02:42.353016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:25:57.480 [... identical "Failed to get dst_mbufs!" errors from accel_dpdk_cryptodev.c:476 repeated continuously through 2024-07-24 19:02:42.368615 ...]
00:25:57.480 [2024-07-24 19:02:42.368942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.369118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.370678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.370724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.370982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.371013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.371382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.372169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.372204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.372960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.373137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.374606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.374654] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.375561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.375607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.375903] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.376797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.376831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.377088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.377340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.378798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.378840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.379729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.379762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.480 [2024-07-24 19:02:42.380151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.381121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.381162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.382151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.382335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.383305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.383347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.383822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.383855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.384170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.385060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.385095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.385536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.385716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.480 [2024-07-24 19:02:42.386847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.386890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.387148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.387186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.387634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.387904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.387939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.388205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.388558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.389962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.481 [2024-07-24 19:02:42.390005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.390827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.390862] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.391241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.392206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.392240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.393186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.393366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.394507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.394549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.395283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.395314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.395609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.396248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.396286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.397008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.397219] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.398041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.398083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.398351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.398380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.398709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.399092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.399127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.481 [2024-07-24 19:02:42.399691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.399873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.400786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.400835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.401101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.401137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.401506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.401774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.401805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.402068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.402294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.403395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.403459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.403739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.403774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.404151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.404416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.404448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.404713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.405027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.405954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.405998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.406261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.406296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.481 [2024-07-24 19:02:42.406621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.406886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.406919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.407177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.407397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.408420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.408463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.408728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.408771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.409227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.409516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.409555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.409811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.410045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.410976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.411029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.411289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.411318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.411695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.411967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.412002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.412261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.412599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.413528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.481 [2024-07-24 19:02:42.413571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.413828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.413859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.414194] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.414464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.414511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.414768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.414987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.415982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.481 [2024-07-24 19:02:42.416027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.416284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.416314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.416651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.416926] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.416962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.417232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.417534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.418656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.418700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.418961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.419000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.419455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.419729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.419767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.482 [2024-07-24 19:02:42.420028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.420321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.421316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.421361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.421623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.421655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.422047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.422325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.422360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.422629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.422930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.423777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.423826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.424090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.424123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.424443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.424714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.424753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.425012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.425347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.426399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.426444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.426712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.426757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:25:57.482 [2024-07-24 19:02:42.427175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.427442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.427482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.427746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.428041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.429068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.429113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.429385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.429421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.429796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.430063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.430094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.430358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.430590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.431708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.431753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.432024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.432059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.432423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.432712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.432755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.433020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.433278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:25:57.482 [2024-07-24 19:02:42.434238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
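The two messages above come from the resource-allocation path of the dpdk_cryptodev accel module (accel_dpdk_cryptodev.c, lines 468 and 476): each crypto task that cannot obtain source or destination mbufs logs one entry, so the 64 KiB, queue-depth-128 verify workload produces a long run of them before the run completes and reports the results below. When reading a log like this one, collapsing the repeats into per-message counts is usually enough; a minimal sketch, assuming the console output has been saved to a local file (the file name is an assumption, not something this job produces):

  # Count how many times each distinct mbuf-allocation error appears in the saved log.
  grep -Eo 'Failed to get (src|dst)_mbufs!' console.log | sort | uniq -c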
00:26:01.670 
00:26:01.670 Latency(us)
00:26:01.670 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:26:01.670 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x0 length 0x100
00:26:01.670 crypto_ram : 5.52 69.40 4.34 0.00 0.00 1803614.36 8488.47 1565873.49
00:26:01.670 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x100 length 0x100
00:26:01.670 crypto_ram : 5.53 68.75 4.30 0.00 0.00 1819933.41 54675.75 1605819.25
00:26:01.670 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x0 length 0x100
00:26:01.670 crypto_ram2 : 5.52 69.57 4.35 0.00 0.00 1758138.76 4774.77 1541906.04
00:26:01.670 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x100 length 0x100
00:26:01.670 crypto_ram2 : 5.53 69.44 4.34 0.00 0.00 1763535.83 3666.90 1589840.94
00:26:01.670 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x0 length 0x100
00:26:01.670 crypto_ram3 : 5.35 481.79 30.11 0.00 0.00 247564.66 23093.64 359511.77
00:26:01.670 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x100 length 0x100
00:26:01.670 crypto_ram3 : 5.36 477.66 29.85 0.00 0.00 249599.62 4119.41 365503.63
00:26:01.670 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x0 length 0x100
00:26:01.670 crypto_ram4 : 5.41 498.54 31.16 0.00 0.00 235315.95 12483.05 319566.02
00:26:01.670 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:01.670 Verification LBA range: start 0x100 length 0x100
00:26:01.670 crypto_ram4 : 5.42 493.23 30.83 0.00 0.00 237568.63 11234.74 313574.16
===================================================================================================================
00:26:01.671 Total : 2228.38 139.27 0.00 0.00 438638.76 3666.90 1605819.25
00:26:01.671 
00:26:01.671 real 0m8.427s
00:26:01.671 user 0m16.238s
00:26:01.671 sys 0m0.274s
00:26:01.671 19:02:46 blockdev_crypto_aesni.bdev_verify_big_io -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:26:01.671 19:02:46 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:26:01.671 ************************************ 00:26:01.671 END TEST bdev_verify_big_io 00:26:01.671 ************************************ 00:26:01.671 19:02:46 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:01.671 19:02:46 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:01.671 19:02:46 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:01.671 19:02:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:01.671 ************************************ 00:26:01.671 START TEST bdev_write_zeroes 00:26:01.671 ************************************ 00:26:01.671 19:02:46 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:01.671 [2024-07-24 19:02:46.670488] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:01.671 [2024-07-24 19:02:46.670523] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2235574 ] 00:26:01.930 [2024-07-24 19:02:46.731540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:01.930 [2024-07-24 19:02:46.803150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:01.930 [2024-07-24 19:02:46.824057] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:01.930 [2024-07-24 19:02:46.832084] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:01.930 [2024-07-24 19:02:46.840105] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:02.188 [2024-07-24 19:02:46.941614] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:04.091 [2024-07-24 19:02:49.085850] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:04.091 [2024-07-24 19:02:49.085905] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:04.091 [2024-07-24 19:02:49.085915] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:04.091 [2024-07-24 19:02:49.093872] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:04.091 [2024-07-24 19:02:49.093891] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:04.091 [2024-07-24 19:02:49.093900] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:04.349 [2024-07-24 19:02:49.101891] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:04.349 [2024-07-24 19:02:49.101905] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev 
with name: Malloc2 00:26:04.349 [2024-07-24 19:02:49.101913] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:04.349 [2024-07-24 19:02:49.109911] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:04.349 [2024-07-24 19:02:49.109923] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:04.349 [2024-07-24 19:02:49.109932] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:04.349 Running I/O for 1 seconds... 00:26:05.284 00:26:05.284 Latency(us) 00:26:05.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:05.284 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:05.284 crypto_ram : 1.02 3015.13 11.78 0.00 0.00 42247.23 3464.05 49432.87 00:26:05.284 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:05.284 crypto_ram2 : 1.02 3021.01 11.80 0.00 0.00 42015.61 3417.23 47685.24 00:26:05.284 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:05.284 crypto_ram3 : 1.01 23499.50 91.79 0.00 0.00 5395.50 1583.79 6896.88 00:26:05.284 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:05.284 crypto_ram4 : 1.01 23484.68 91.74 0.00 0.00 5383.72 1583.79 6834.47 00:26:05.284 =================================================================================================================== 00:26:05.284 Total : 53020.33 207.11 0.00 0.00 9583.70 1583.79 49432.87 00:26:05.542 00:26:05.542 real 0m3.883s 00:26:05.542 user 0m3.598s 00:26:05.542 sys 0m0.246s 00:26:05.542 19:02:50 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:05.542 19:02:50 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:26:05.542 ************************************ 00:26:05.542 END TEST bdev_write_zeroes 00:26:05.542 ************************************ 00:26:05.542 19:02:50 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:05.542 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:05.542 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:05.542 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:05.800 ************************************ 00:26:05.800 START TEST bdev_json_nonenclosed 00:26:05.800 ************************************ 00:26:05.800 19:02:50 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:05.800 [2024-07-24 19:02:50.620665] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:05.800 [2024-07-24 19:02:50.620699] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236267 ] 00:26:05.800 [2024-07-24 19:02:50.681062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:05.800 [2024-07-24 19:02:50.751410] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:05.800 [2024-07-24 19:02:50.751477] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:26:05.800 [2024-07-24 19:02:50.751489] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:05.800 [2024-07-24 19:02:50.751496] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:06.058 00:26:06.058 real 0m0.251s 00:26:06.058 user 0m0.173s 00:26:06.058 sys 0m0.076s 00:26:06.058 19:02:50 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:06.058 19:02:50 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:06.058 ************************************ 00:26:06.058 END TEST bdev_json_nonenclosed 00:26:06.058 ************************************ 00:26:06.058 19:02:50 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:06.058 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:06.058 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:06.059 19:02:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:06.059 ************************************ 00:26:06.059 START TEST bdev_json_nonarray 00:26:06.059 ************************************ 00:26:06.059 19:02:50 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:06.059 [2024-07-24 19:02:50.923738] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:06.059 [2024-07-24 19:02:50.923774] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236288 ] 00:26:06.059 [2024-07-24 19:02:50.985226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:06.059 [2024-07-24 19:02:51.056483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:06.059 [2024-07-24 19:02:51.056549] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
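bdev_json_nonenclosed and bdev_json_nonarray are negative tests: they feed bdevperf a configuration whose top level is not a JSON object and one whose "subsystems" member is not an array, and only check that the app reports the two errors shown above and stops instead of starting I/O. A minimal sketch of reproducing the first case by hand, assuming an SPDK build tree under ./spdk; the config content below is illustrative and is not the repository's actual test/bdev/nonenclosed.json:

  # A syntactically valid JSON document whose top level is an array, not an
  # object, which trips the "not enclosed in {}" check in json_config.c.
  printf '[]\n' > /tmp/nonenclosed.json
  ./spdk/build/examples/bdevperf --json /tmp/nonenclosed.json \
      -q 128 -o 4096 -w write_zeroes -t 1
  echo "bdevperf exited with $?"   # expected to be non-zero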
00:26:06.059 [2024-07-24 19:02:51.056560] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:06.059 [2024-07-24 19:02:51.056567] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:06.318 00:26:06.318 real 0m0.244s 00:26:06.318 user 0m0.163s 00:26:06.318 sys 0m0.079s 00:26:06.318 19:02:51 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:06.318 19:02:51 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:06.318 ************************************ 00:26:06.318 END TEST bdev_json_nonarray 00:26:06.318 ************************************ 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:26:06.318 19:02:51 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:26:06.318 00:26:06.318 real 1m6.191s 00:26:06.318 user 2m39.078s 00:26:06.318 sys 0m5.790s 00:26:06.318 19:02:51 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:06.318 19:02:51 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:06.318 ************************************ 00:26:06.318 END TEST blockdev_crypto_aesni 00:26:06.318 ************************************ 00:26:06.318 19:02:51 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:06.318 19:02:51 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:06.318 19:02:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:06.318 19:02:51 -- common/autotest_common.sh@10 -- # set +x 00:26:06.318 ************************************ 00:26:06.318 START TEST blockdev_crypto_sw 00:26:06.318 ************************************ 00:26:06.318 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:06.318 * Looking for test storage... 
00:26:06.318 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:06.318 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2236556 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:06.577 19:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2236556 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2236556 ']' 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
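Once spdk_tgt is up in --wait-for-rpc mode, the crypto_sw variant of the suite builds its stack entirely over RPC: two malloc base bdevs, software keys named test_dek_sw*, and crypto vbdevs (crypto_ram, crypto_ram2, crypto_ram3) layered on top, which is what the NOTICE lines and the JSON bdev dump below reflect. A rough sketch of an equivalent manual rpc.py sequence follows; the key material, sizes, cipher and option spellings are assumptions and may differ between SPDK releases, so check the "rpc.py <command> -h" output of the revision under test:

  RPC=./spdk/scripts/rpc.py
  $RPC framework_start_init                    # leave --wait-for-rpc mode
  $RPC bdev_malloc_create -b Malloc0 32 512    # base bdevs for the crypto vbdevs
  $RPC bdev_malloc_create -b Malloc1 16 4096
  $RPC accel_crypto_key_create -c AES_XTS -n test_dek_sw \
       -k 00112233445566778899aabbccddeeff -e ffeeddccbbaa99887766554433221100
  $RPC bdev_crypto_create -n test_dek_sw Malloc0 crypto_ram
  $RPC bdev_get_bdevs                          # prints the crypto_* bdevs as JSON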
00:26:06.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:06.577 19:02:51 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:06.577 [2024-07-24 19:02:51.395085] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:06.577 [2024-07-24 19:02:51.395138] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236556 ] 00:26:06.577 [2024-07-24 19:02:51.459582] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:06.577 [2024-07-24 19:02:51.536384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.512 Malloc0 00:26:07.512 Malloc1 00:26:07.512 true 00:26:07.512 true 00:26:07.512 true 00:26:07.512 [2024-07-24 19:02:52.410051] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:07.512 crypto_ram 00:26:07.512 [2024-07-24 19:02:52.418080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:07.512 crypto_ram2 00:26:07.512 [2024-07-24 19:02:52.426098] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:07.512 crypto_ram3 00:26:07.512 [ 00:26:07.512 { 00:26:07.512 "name": "Malloc1", 00:26:07.512 "aliases": [ 00:26:07.512 "dbf0748a-e390-40d8-a734-560b1df87f03" 00:26:07.512 ], 00:26:07.512 "product_name": "Malloc disk", 00:26:07.512 "block_size": 4096, 00:26:07.512 "num_blocks": 4096, 00:26:07.512 "uuid": "dbf0748a-e390-40d8-a734-560b1df87f03", 00:26:07.512 "assigned_rate_limits": { 00:26:07.512 "rw_ios_per_sec": 0, 00:26:07.512 "rw_mbytes_per_sec": 0, 00:26:07.512 "r_mbytes_per_sec": 0, 00:26:07.512 "w_mbytes_per_sec": 0 00:26:07.512 }, 00:26:07.512 "claimed": true, 00:26:07.512 "claim_type": "exclusive_write", 00:26:07.512 "zoned": false, 00:26:07.512 "supported_io_types": { 00:26:07.512 "read": true, 00:26:07.512 "write": true, 00:26:07.512 "unmap": true, 00:26:07.512 "flush": true, 00:26:07.512 "reset": true, 00:26:07.512 "nvme_admin": false, 00:26:07.512 "nvme_io": false, 00:26:07.512 "nvme_io_md": false, 00:26:07.512 "write_zeroes": true, 00:26:07.512 "zcopy": true, 00:26:07.512 "get_zone_info": false, 00:26:07.512 "zone_management": false, 00:26:07.512 "zone_append": false, 00:26:07.512 "compare": false, 00:26:07.512 "compare_and_write": false, 00:26:07.512 "abort": true, 00:26:07.512 "seek_hole": false, 00:26:07.512 "seek_data": false, 00:26:07.512 "copy": true, 00:26:07.512 "nvme_iov_md": false 00:26:07.512 }, 00:26:07.512 "memory_domains": [ 00:26:07.512 { 00:26:07.512 "dma_device_id": "system", 00:26:07.512 "dma_device_type": 1 00:26:07.512 }, 00:26:07.512 { 
00:26:07.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.512 "dma_device_type": 2 00:26:07.512 } 00:26:07.512 ], 00:26:07.512 "driver_specific": {} 00:26:07.512 } 00:26:07.512 ] 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:26:07.512 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:07.512 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "94ec7948-716b-5f77-8d2b-b30563eaf30d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "94ec7948-716b-5f77-8d2b-b30563eaf30d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "24da42b2-53e3-5498-9468-672110bef52f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "24da42b2-53e3-5498-9468-672110bef52f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:26:07.771 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 2236556 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2236556 ']' 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2236556 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2236556 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2236556' 00:26:07.771 killing process with pid 2236556 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2236556 00:26:07.771 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2236556 00:26:08.039 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:08.039 19:02:52 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:08.039 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:08.039 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:08.039 19:02:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:08.039 ************************************ 00:26:08.039 START TEST bdev_hello_world 00:26:08.039 
************************************ 00:26:08.039 19:02:52 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:08.039 [2024-07-24 19:02:53.027202] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:08.039 [2024-07-24 19:02:53.027235] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236817 ] 00:26:08.300 [2024-07-24 19:02:53.087772] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:08.300 [2024-07-24 19:02:53.159301] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:08.558 [2024-07-24 19:02:53.316054] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:08.558 [2024-07-24 19:02:53.316105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:08.558 [2024-07-24 19:02:53.316116] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:08.558 [2024-07-24 19:02:53.324075] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:08.558 [2024-07-24 19:02:53.324091] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:08.558 [2024-07-24 19:02:53.324101] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:08.558 [2024-07-24 19:02:53.332097] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:08.558 [2024-07-24 19:02:53.332111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:08.558 [2024-07-24 19:02:53.332124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:08.558 [2024-07-24 19:02:53.369988] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:08.558 [2024-07-24 19:02:53.370012] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:08.558 [2024-07-24 19:02:53.370024] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:26:08.558 [2024-07-24 19:02:53.370852] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:08.558 [2024-07-24 19:02:53.370918] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:08.558 [2024-07-24 19:02:53.370930] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:08.558 [2024-07-24 19:02:53.370956] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
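At this point the hello_bdev example has opened the software-crypto bdev, written a buffer through it, and read back "Hello World!". A minimal sketch of the invocation exercised here, with paths abbreviated relative to the spdk checkout (the --json file carries the Malloc/crypto bdev configuration and -b names the bdev to open):

    # same invocation as in this run, shortened paths for readability
    ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b crypto_ram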
00:26:08.558 00:26:08.558 [2024-07-24 19:02:53.370969] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:08.558 00:26:08.558 real 0m0.560s 00:26:08.558 user 0m0.396s 00:26:08.558 sys 0m0.150s 00:26:08.558 19:02:53 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:08.558 19:02:53 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:08.558 ************************************ 00:26:08.558 END TEST bdev_hello_world 00:26:08.558 ************************************ 00:26:08.819 19:02:53 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:26:08.819 19:02:53 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:08.819 19:02:53 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:08.819 19:02:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:08.819 ************************************ 00:26:08.819 START TEST bdev_bounds 00:26:08.819 ************************************ 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=2236848 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 2236848' 00:26:08.819 Process bdevio pid: 2236848 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 2236848 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2236848 ']' 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:08.819 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:08.819 19:02:53 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:08.819 [2024-07-24 19:02:53.631568] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:08.819 [2024-07-24 19:02:53.631604] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236848 ] 00:26:08.819 [2024-07-24 19:02:53.694041] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:08.819 [2024-07-24 19:02:53.774092] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:08.819 [2024-07-24 19:02:53.774105] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:08.819 [2024-07-24 19:02:53.774107] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:09.078 [2024-07-24 19:02:53.938741] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:09.078 [2024-07-24 19:02:53.938800] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:09.078 [2024-07-24 19:02:53.938810] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:09.078 [2024-07-24 19:02:53.946765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:09.078 [2024-07-24 19:02:53.946780] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:09.078 [2024-07-24 19:02:53.946788] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:09.078 [2024-07-24 19:02:53.954789] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:09.078 [2024-07-24 19:02:53.954801] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:09.078 [2024-07-24 19:02:53.954810] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:09.646 I/O targets: 00:26:09.646 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:26:09.646 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:26:09.646 00:26:09.646 00:26:09.646 CUnit - A unit testing framework for C - Version 2.1-3 00:26:09.646 http://cunit.sourceforge.net/ 00:26:09.646 00:26:09.646 00:26:09.646 Suite: bdevio tests on: crypto_ram3 00:26:09.646 Test: blockdev write read block ...passed 00:26:09.646 Test: blockdev write zeroes read block ...passed 00:26:09.646 Test: blockdev write zeroes read no split ...passed 00:26:09.646 Test: blockdev write zeroes read split ...passed 00:26:09.646 Test: blockdev write zeroes read split partial ...passed 00:26:09.646 Test: blockdev reset ...passed 00:26:09.646 Test: blockdev write read 8 blocks ...passed 00:26:09.646 Test: blockdev write read size > 128k ...passed 00:26:09.646 Test: blockdev write read invalid size ...passed 00:26:09.646 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:09.646 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:09.646 Test: blockdev write read max offset ...passed 00:26:09.646 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:09.646 Test: blockdev writev readv 8 blocks 
...passed 00:26:09.646 Test: blockdev writev readv 30 x 1block ...passed 00:26:09.646 Test: blockdev writev readv block ...passed 00:26:09.646 Test: blockdev writev readv size > 128k ...passed 00:26:09.646 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:09.646 Test: blockdev comparev and writev ...passed 00:26:09.646 Test: blockdev nvme passthru rw ...passed 00:26:09.646 Test: blockdev nvme passthru vendor specific ...passed 00:26:09.646 Test: blockdev nvme admin passthru ...passed 00:26:09.646 Test: blockdev copy ...passed 00:26:09.646 Suite: bdevio tests on: crypto_ram 00:26:09.646 Test: blockdev write read block ...passed 00:26:09.646 Test: blockdev write zeroes read block ...passed 00:26:09.646 Test: blockdev write zeroes read no split ...passed 00:26:09.646 Test: blockdev write zeroes read split ...passed 00:26:09.646 Test: blockdev write zeroes read split partial ...passed 00:26:09.646 Test: blockdev reset ...passed 00:26:09.646 Test: blockdev write read 8 blocks ...passed 00:26:09.646 Test: blockdev write read size > 128k ...passed 00:26:09.646 Test: blockdev write read invalid size ...passed 00:26:09.646 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:09.646 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:09.646 Test: blockdev write read max offset ...passed 00:26:09.646 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:09.646 Test: blockdev writev readv 8 blocks ...passed 00:26:09.646 Test: blockdev writev readv 30 x 1block ...passed 00:26:09.646 Test: blockdev writev readv block ...passed 00:26:09.646 Test: blockdev writev readv size > 128k ...passed 00:26:09.646 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:09.646 Test: blockdev comparev and writev ...passed 00:26:09.646 Test: blockdev nvme passthru rw ...passed 00:26:09.646 Test: blockdev nvme passthru vendor specific ...passed 00:26:09.646 Test: blockdev nvme admin passthru ...passed 00:26:09.646 Test: blockdev copy ...passed 00:26:09.646 00:26:09.646 Run Summary: Type Total Ran Passed Failed Inactive 00:26:09.646 suites 2 2 n/a 0 0 00:26:09.646 tests 46 46 46 0 0 00:26:09.646 asserts 260 260 260 0 n/a 00:26:09.646 00:26:09.646 Elapsed time = 0.076 seconds 00:26:09.646 0 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 2236848 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2236848 ']' 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2236848 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2236848 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2236848' 00:26:09.646 killing process with pid 2236848 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2236848 00:26:09.646 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
-- # wait 2236848 00:26:09.904 19:02:54 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:26:09.904 00:26:09.904 real 0m1.191s 00:26:09.904 user 0m3.215s 00:26:09.904 sys 0m0.253s 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:09.905 ************************************ 00:26:09.905 END TEST bdev_bounds 00:26:09.905 ************************************ 00:26:09.905 19:02:54 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:09.905 19:02:54 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:09.905 19:02:54 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:09.905 19:02:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:09.905 ************************************ 00:26:09.905 START TEST bdev_nbd 00:26:09.905 ************************************ 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=2237102 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 
--json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 2237102 /var/tmp/spdk-nbd.sock 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2237102 ']' 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:26:09.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:09.905 19:02:54 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:09.905 [2024-07-24 19:02:54.907285] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:09.905 [2024-07-24 19:02:54.907322] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:10.163 [2024-07-24 19:02:54.971276] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:10.163 [2024-07-24 19:02:55.049342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:10.421 [2024-07-24 19:02:55.209828] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:10.421 [2024-07-24 19:02:55.209878] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:10.421 [2024-07-24 19:02:55.209888] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.421 [2024-07-24 19:02:55.217847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:10.421 [2024-07-24 19:02:55.217863] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:10.421 [2024-07-24 19:02:55.217871] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.421 [2024-07-24 19:02:55.225866] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:10.421 [2024-07-24 19:02:55.225879] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:10.421 [2024-07-24 19:02:55.225887] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:10.985 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:26:10.986 19:02:55 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:10.986 1+0 records in 00:26:10.986 1+0 records out 00:26:10.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226422 s, 18.1 MB/s 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:10.986 19:02:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:11.244 1+0 records in 00:26:11.244 1+0 records out 00:26:11.244 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230316 s, 17.8 MB/s 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:11.244 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:26:11.503 { 00:26:11.503 "nbd_device": "/dev/nbd0", 00:26:11.503 "bdev_name": "crypto_ram" 00:26:11.503 }, 00:26:11.503 { 00:26:11.503 "nbd_device": "/dev/nbd1", 00:26:11.503 "bdev_name": "crypto_ram3" 00:26:11.503 } 00:26:11.503 ]' 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:26:11.503 { 00:26:11.503 "nbd_device": "/dev/nbd0", 00:26:11.503 "bdev_name": "crypto_ram" 00:26:11.503 }, 00:26:11.503 { 00:26:11.503 "nbd_device": "/dev/nbd1", 00:26:11.503 "bdev_name": "crypto_ram3" 00:26:11.503 } 00:26:11.503 ]' 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.503 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:11.761 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# true 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:12.019 19:02:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:26:12.278 /dev/nbd0 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:12.278 1+0 records in 00:26:12.278 1+0 records out 00:26:12.278 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213459 s, 19.2 MB/s 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:26:12.278 /dev/nbd1 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:12.278 1+0 records in 00:26:12.278 1+0 records out 00:26:12.278 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221524 s, 18.5 MB/s 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:12.278 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 
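Above, nbd_function_test has exported crypto_ram and crypto_ram3 as /dev/nbd0 and /dev/nbd1 through the bdev_svc RPC socket; below it writes a random pattern through each device with dd and compares it back with cmp. A condensed sketch of one write/verify cycle, assuming the same socket path and block counts as this run (the scratch-file location is illustrative, not the workspace path used here):

    RPC="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    $RPC nbd_start_disk crypto_ram /dev/nbd0                              # export the bdev as an nbd device
    dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256              # generate reference data
    dd if=/tmp/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct    # write through the crypto bdev
    cmp -b -n 1M /tmp/nbdrandtest /dev/nbd0                               # read back and compare
    $RPC nbd_stop_disk /dev/nbd0                                          # tear the export down again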
00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:12.537 { 00:26:12.537 "nbd_device": "/dev/nbd0", 00:26:12.537 "bdev_name": "crypto_ram" 00:26:12.537 }, 00:26:12.537 { 00:26:12.537 "nbd_device": "/dev/nbd1", 00:26:12.537 "bdev_name": "crypto_ram3" 00:26:12.537 } 00:26:12.537 ]' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:26:12.537 { 00:26:12.537 "nbd_device": "/dev/nbd0", 00:26:12.537 "bdev_name": "crypto_ram" 00:26:12.537 }, 00:26:12.537 { 00:26:12.537 "nbd_device": "/dev/nbd1", 00:26:12.537 "bdev_name": "crypto_ram3" 00:26:12.537 } 00:26:12.537 ]' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:12.537 /dev/nbd1' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:26:12.537 /dev/nbd1' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:12.537 256+0 records in 00:26:12.537 256+0 records out 00:26:12.537 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103786 s, 101 MB/s 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:12.537 256+0 records in 00:26:12.537 256+0 records out 00:26:12.537 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.013869 s, 75.6 MB/s 00:26:12.537 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:12.538 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 
00:26:12.796 256+0 records in 00:26:12.796 256+0 records out 00:26:12.796 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210084 s, 49.9 MB/s 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:12.796 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:13.054 19:02:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:13.313 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:13.594 malloc_lvol_verify 00:26:13.594 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:13.594 cd80138e-f2f5-4077-bcfd-ac10e4f11f3a 00:26:13.594 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:13.870 ce8e7e55-c5a9-4958-87e3-d77ffb5b957c 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:13.870 /dev/nbd0 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:13.870 mke2fs 1.46.5 (30-Dec-2021) 00:26:13.870 Discarding device blocks: 0/4096 done 00:26:13.870 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:13.870 00:26:13.870 Allocating group tables: 0/1 done 00:26:13.870 Writing inode tables: 0/1 done 00:26:13.870 Creating journal (1024 blocks): done 00:26:13.870 Writing superblocks and filesystem accounting information: 0/1 done 00:26:13.870 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:13.870 19:02:58 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 2237102 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2237102 ']' 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2237102 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2237102 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:14.129 
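The teardown around this point follows the killprocess pattern these tests use throughout: check that the recorded pid still belongs to an SPDK reactor, then kill it and wait for it to exit. A rough sketch, assuming the helper behaves as the xtrace output suggests, with the pid value taken from this run:

    pid=2237102                                   # pid recorded when bdev_svc was launched
    proc_name=$(ps --no-headers -o comm= "$pid")  # e.g. reactor_0
    if [ "$proc_name" != "sudo" ]; then           # avoid signalling a sudo wrapper by mistake
        kill "$pid"
    fi
    wait "$pid"                                   # reap the child once it exits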
19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2237102' 00:26:14.129 killing process with pid 2237102 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2237102 00:26:14.129 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2237102 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:26:14.387 00:26:14.387 real 0m4.385s 00:26:14.387 user 0m6.406s 00:26:14.387 sys 0m1.432s 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:14.387 ************************************ 00:26:14.387 END TEST bdev_nbd 00:26:14.387 ************************************ 00:26:14.387 19:02:59 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:26:14.387 19:02:59 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:26:14.387 19:02:59 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:26:14.387 19:02:59 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:26:14.387 19:02:59 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:14.387 19:02:59 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:14.387 19:02:59 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:14.387 ************************************ 00:26:14.387 START TEST bdev_fio 00:26:14.387 ************************************ 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:14.387 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=verify 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type=AIO 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:26:14.387 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- 
# '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z verify ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1311 -- # '[' verify == verify ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # cat 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1321 -- # '[' AIO == AIO ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # /usr/src/fio/fio --version 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # echo serialize_overlap=1 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:14.388 ************************************ 00:26:14.388 START TEST bdev_fio_rw_verify 00:26:14.388 ************************************ 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local sanitizers 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # shift 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # local asan_lib= 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:26:14.388 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libasan 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:14.646 19:02:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.904 job_crypto_ram: (g=0): 
rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:14.904 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:14.904 fio-3.35 00:26:14.904 Starting 2 threads 00:26:27.101 00:26:27.101 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2238277: Wed Jul 24 19:03:10 2024 00:26:27.101 read: IOPS=29.6k, BW=115MiB/s (121MB/s)(1155MiB/10000msec) 00:26:27.101 slat (usec): min=9, max=1710, avg=15.41, stdev= 6.26 00:26:27.101 clat (usec): min=5, max=1887, avg=109.22, stdev=50.02 00:26:27.101 lat (usec): min=18, max=1903, avg=124.64, stdev=52.67 00:26:27.101 clat percentiles (usec): 00:26:27.101 | 50.000th=[ 104], 99.000th=[ 269], 99.900th=[ 371], 99.990th=[ 445], 00:26:27.101 | 99.999th=[ 1860] 00:26:27.101 write: IOPS=35.4k, BW=138MiB/s (145MB/s)(1312MiB/9479msec); 0 zone resets 00:26:27.101 slat (usec): min=9, max=141, avg=25.06, stdev= 6.25 00:26:27.101 clat (usec): min=15, max=896, avg=145.71, stdev=72.95 00:26:27.101 lat (usec): min=34, max=994, avg=170.78, stdev=75.93 00:26:27.101 clat percentiles (usec): 00:26:27.101 | 50.000th=[ 141], 99.000th=[ 367], 99.900th=[ 537], 99.990th=[ 635], 00:26:27.101 | 99.999th=[ 857] 00:26:27.101 bw ( KiB/s): min=89864, max=147632, per=94.80%, avg=134360.00, stdev=7613.35, samples=38 00:26:27.101 iops : min=22466, max=36908, avg=33590.00, stdev=1903.34, samples=38 00:26:27.101 lat (usec) : 10=0.01%, 20=0.01%, 50=8.26%, 100=29.52%, 250=57.49% 00:26:27.101 lat (usec) : 500=4.60%, 750=0.10%, 1000=0.01% 00:26:27.101 lat (msec) : 2=0.01% 00:26:27.101 cpu : usr=99.65%, sys=0.01%, ctx=40, majf=0, minf=621 00:26:27.101 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:27.101 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:27.101 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:27.101 issued rwts: total=295633,335856,0,0 short=0,0,0,0 dropped=0,0,0,0 00:26:27.101 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:27.101 00:26:27.101 Run status group 0 (all jobs): 00:26:27.101 READ: bw=115MiB/s (121MB/s), 115MiB/s-115MiB/s (121MB/s-121MB/s), io=1155MiB (1211MB), run=10000-10000msec 00:26:27.101 WRITE: bw=138MiB/s (145MB/s), 138MiB/s-138MiB/s (145MB/s-145MB/s), io=1312MiB (1376MB), run=9479-9479msec 00:26:27.101 00:26:27.101 real 0m10.994s 00:26:27.101 user 0m26.834s 00:26:27.101 sys 0m0.271s 00:26:27.101 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:26:27.102 ************************************ 00:26:27.102 END TEST bdev_fio_rw_verify 00:26:27.102 ************************************ 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=trim 
00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type= 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z trim ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1311 -- # '[' trim == verify ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1326 -- # '[' trim == trim ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1327 -- # echo rw=trimwrite 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "94ec7948-716b-5f77-8d2b-b30563eaf30d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "94ec7948-716b-5f77-8d2b-b30563eaf30d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "24da42b2-53e3-5498-9468-672110bef52f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "24da42b2-53e3-5498-9468-672110bef52f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:26:27.102 crypto_ram3 ]] 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "94ec7948-716b-5f77-8d2b-b30563eaf30d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "94ec7948-716b-5f77-8d2b-b30563eaf30d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "24da42b2-53e3-5498-9468-672110bef52f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "24da42b2-53e3-5498-9468-672110bef52f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:27.102 ************************************ 00:26:27.102 START TEST bdev_fio_trim 00:26:27.102 ************************************ 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local sanitizers 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # shift 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # local asan_lib= 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libasan 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for 
sanitizer in "${sanitizers[@]}" 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:27.102 19:03:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:27.102 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:27.102 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:27.102 fio-3.35 00:26:27.102 Starting 2 threads 00:26:37.084 00:26:37.084 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2240716: Wed Jul 24 19:03:21 2024 00:26:37.084 write: IOPS=54.2k, BW=212MiB/s (222MB/s)(2116MiB/10001msec); 0 zone resets 00:26:37.084 slat (usec): min=9, max=291, avg=16.00, stdev= 3.38 00:26:37.084 clat (usec): min=25, max=1716, avg=121.63, stdev=67.93 00:26:37.084 lat (usec): min=35, max=1736, avg=137.63, stdev=70.38 00:26:37.084 clat percentiles (usec): 00:26:37.084 | 50.000th=[ 97], 99.000th=[ 255], 99.900th=[ 277], 99.990th=[ 529], 00:26:37.084 | 99.999th=[ 1647] 00:26:37.084 bw ( KiB/s): min=211777, max=218352, per=100.00%, avg=216737.74, stdev=783.26, samples=38 00:26:37.084 iops : min=52944, max=54588, avg=54184.42, stdev=195.84, samples=38 00:26:37.084 trim: IOPS=54.2k, BW=212MiB/s (222MB/s)(2116MiB/10001msec); 0 zone resets 00:26:37.084 slat (usec): min=3, max=126, avg= 7.44, stdev= 2.04 00:26:37.084 clat (usec): min=29, max=1567, avg=81.07, stdev=24.94 00:26:37.084 lat (usec): min=34, max=1575, avg=88.52, stdev=25.14 00:26:37.084 clat percentiles (usec): 00:26:37.084 | 50.000th=[ 82], 99.000th=[ 135], 99.900th=[ 149], 99.990th=[ 233], 00:26:37.084 | 99.999th=[ 363] 00:26:37.084 bw ( KiB/s): min=211809, max=218360, per=100.00%, avg=216739.42, stdev=780.67, samples=38 00:26:37.084 iops : min=52952, max=54590, avg=54184.84, stdev=195.19, samples=38 00:26:37.084 lat (usec) : 50=14.02%, 100=48.76%, 250=36.43%, 500=0.78%, 750=0.01% 00:26:37.084 lat (msec) : 2=0.01% 00:26:37.084 cpu : usr=99.72%, sys=0.00%, ctx=26, majf=0, minf=282 00:26:37.084 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:37.084 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:37.084 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:37.084 issued rwts: total=0,541667,541668,0 short=0,0,0,0 dropped=0,0,0,0 00:26:37.084 latency : target=0, window=0, 
percentile=100.00%, depth=8 00:26:37.084 00:26:37.084 Run status group 0 (all jobs): 00:26:37.084 WRITE: bw=212MiB/s (222MB/s), 212MiB/s-212MiB/s (222MB/s-222MB/s), io=2116MiB (2219MB), run=10001-10001msec 00:26:37.084 TRIM: bw=212MiB/s (222MB/s), 212MiB/s-212MiB/s (222MB/s-222MB/s), io=2116MiB (2219MB), run=10001-10001msec 00:26:37.084 00:26:37.084 real 0m11.048s 00:26:37.084 user 0m26.005s 00:26:37.084 sys 0m0.259s 00:26:37.084 19:03:21 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:37.084 19:03:21 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:37.084 ************************************ 00:26:37.084 END TEST bdev_fio_trim 00:26:37.084 ************************************ 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:26:37.085 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:26:37.085 00:26:37.085 real 0m22.336s 00:26:37.085 user 0m53.001s 00:26:37.085 sys 0m0.677s 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:37.085 ************************************ 00:26:37.085 END TEST bdev_fio 00:26:37.085 ************************************ 00:26:37.085 19:03:21 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:37.085 19:03:21 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:37.085 19:03:21 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:37.085 19:03:21 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:37.085 19:03:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:37.085 ************************************ 00:26:37.085 START TEST bdev_verify 00:26:37.085 ************************************ 00:26:37.085 19:03:21 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:37.085 [2024-07-24 19:03:21.768511] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:37.085 [2024-07-24 19:03:21.768556] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2242333 ] 00:26:37.085 [2024-07-24 19:03:21.834492] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:37.085 [2024-07-24 19:03:21.910717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:37.085 [2024-07-24 19:03:21.910720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:37.085 [2024-07-24 19:03:22.061945] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:37.085 [2024-07-24 19:03:22.061994] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:37.085 [2024-07-24 19:03:22.062003] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:37.085 [2024-07-24 19:03:22.069966] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:37.085 [2024-07-24 19:03:22.069982] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:37.085 [2024-07-24 19:03:22.069991] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:37.085 [2024-07-24 19:03:22.077987] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:37.085 [2024-07-24 19:03:22.078001] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:37.085 [2024-07-24 19:03:22.078009] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:37.342 Running I/O for 5 seconds... 
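The bdev_verify stage traced above amounts to a single bdevperf run against the generated bdev.json. A minimal sketch of the equivalent standalone invocation, reconstructed from the command line in this trace (SPDK_DIR is a shorthand introduced here for readability; the flag comments describe common bdevperf usage and are not quoted from this log):

```bash
# Sketch only: reconstructed from the bdev_verify trace above.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed shorthand for the checked-out tree

bdevperf_args=(
    --json "$SPDK_DIR/test/bdev/bdev.json"   # bdev config defining crypto_ram/crypto_ram3 over the test_dek_sw* keys
    -q 128                                   # queue depth (outstanding I/Os per job)
    -o 4096                                  # I/O size in bytes
    -w verify                                # verification workload, as reported in the result table
    -t 5                                     # run time in seconds
    -m 0x3                                   # core mask; the log shows reactors started on cores 0 and 1
    -C                                       # retained from the original command line, not annotated here
)
"$SPDK_DIR/build/examples/bdevperf" "${bdevperf_args[@]}"
```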
00:26:42.607 00:26:42.607 Latency(us) 00:26:42.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:42.607 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:42.607 Verification LBA range: start 0x0 length 0x800 00:26:42.607 crypto_ram : 5.01 8015.62 31.31 0.00 0.00 15908.98 1146.88 19473.55 00:26:42.607 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:42.607 Verification LBA range: start 0x800 length 0x800 00:26:42.607 crypto_ram : 5.01 8017.25 31.32 0.00 0.00 15906.87 1318.52 19348.72 00:26:42.607 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:42.607 Verification LBA range: start 0x0 length 0x800 00:26:42.607 crypto_ram3 : 5.02 4006.56 15.65 0.00 0.00 31813.30 5492.54 23218.47 00:26:42.607 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:42.607 Verification LBA range: start 0x800 length 0x800 00:26:42.607 crypto_ram3 : 5.01 4007.33 15.65 0.00 0.00 31806.04 5991.86 23218.47 00:26:42.607 =================================================================================================================== 00:26:42.607 Total : 24046.76 93.93 0.00 0.00 21208.51 1146.88 23218.47 00:26:42.607 00:26:42.607 real 0m5.620s 00:26:42.607 user 0m10.743s 00:26:42.607 sys 0m0.171s 00:26:42.607 19:03:27 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:42.607 19:03:27 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:26:42.607 ************************************ 00:26:42.607 END TEST bdev_verify 00:26:42.607 ************************************ 00:26:42.607 19:03:27 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:42.607 19:03:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:42.607 19:03:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:42.607 19:03:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:42.607 ************************************ 00:26:42.607 START TEST bdev_verify_big_io 00:26:42.607 ************************************ 00:26:42.607 19:03:27 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:42.607 [2024-07-24 19:03:27.449228] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:42.607 [2024-07-24 19:03:27.449262] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2243252 ] 00:26:42.607 [2024-07-24 19:03:27.512257] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:42.607 [2024-07-24 19:03:27.583952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:42.607 [2024-07-24 19:03:27.583955] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:42.865 [2024-07-24 19:03:27.747077] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:42.866 [2024-07-24 19:03:27.747133] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:42.866 [2024-07-24 19:03:27.747143] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:42.866 [2024-07-24 19:03:27.755100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:42.866 [2024-07-24 19:03:27.755117] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:42.866 [2024-07-24 19:03:27.755126] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:42.866 [2024-07-24 19:03:27.763122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:42.866 [2024-07-24 19:03:27.763136] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:42.866 [2024-07-24 19:03:27.763144] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:42.866 Running I/O for 5 seconds... 
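The crypto_sw layering these verify runs exercise is visible in the NOTICE lines and the earlier bdev dump: test_dek_sw over Malloc0 yields crypto_ram, test_dek_sw2 over Malloc1 yields crypto_ram2, and test_dek_sw3 stacked on crypto_ram2 yields crypto_ram3. A rough sketch of the bdev_crypto_create entries that would declare that stack in the generated bdev.json, shown here as a heredoc fragment; the parameter names follow the driver_specific output printed above, and the malloc base bdevs plus the accel crypto keys are configured elsewhere in the file and omitted:

```bash
# Sketch only: a fragment of the bdev subsystem config implied by this log,
# not the verbatim file used by the suite.
cat <<'EOF'
{ "method": "bdev_crypto_create",
  "params": { "base_bdev_name": "Malloc0",     "name": "crypto_ram",  "key_name": "test_dek_sw"  } },
{ "method": "bdev_crypto_create",
  "params": { "base_bdev_name": "Malloc1",     "name": "crypto_ram2", "key_name": "test_dek_sw2" } },
{ "method": "bdev_crypto_create",
  "params": { "base_bdev_name": "crypto_ram2", "name": "crypto_ram3", "key_name": "test_dek_sw3" } }
EOF
```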
00:26:48.133 00:26:48.133 Latency(us) 00:26:48.133 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:48.133 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:48.133 Verification LBA range: start 0x0 length 0x80 00:26:48.133 crypto_ram : 5.20 714.10 44.63 0.00 0.00 176255.66 4400.27 255652.82 00:26:48.133 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:48.133 Verification LBA range: start 0x80 length 0x80 00:26:48.133 crypto_ram : 5.05 709.37 44.34 0.00 0.00 177225.67 4181.82 257650.10 00:26:48.133 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:48.133 Verification LBA range: start 0x0 length 0x80 00:26:48.133 crypto_ram3 : 5.21 368.71 23.04 0.00 0.00 332692.51 3947.76 257650.10 00:26:48.133 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:48.133 Verification LBA range: start 0x80 length 0x80 00:26:48.133 crypto_ram3 : 5.19 369.68 23.11 0.00 0.00 331529.24 4805.97 257650.10 00:26:48.133 =================================================================================================================== 00:26:48.133 Total : 2161.85 135.12 0.00 0.00 230311.02 3947.76 257650.10 00:26:48.391 00:26:48.391 real 0m5.818s 00:26:48.391 user 0m11.147s 00:26:48.391 sys 0m0.172s 00:26:48.391 19:03:33 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:48.391 19:03:33 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:26:48.391 ************************************ 00:26:48.391 END TEST bdev_verify_big_io 00:26:48.391 ************************************ 00:26:48.391 19:03:33 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:48.391 19:03:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:48.391 19:03:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:48.391 19:03:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:48.391 ************************************ 00:26:48.391 START TEST bdev_write_zeroes 00:26:48.391 ************************************ 00:26:48.392 19:03:33 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:48.392 [2024-07-24 19:03:33.314896] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:48.392 [2024-07-24 19:03:33.314931] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2244175 ] 00:26:48.392 [2024-07-24 19:03:33.376925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:48.650 [2024-07-24 19:03:33.448485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:48.650 [2024-07-24 19:03:33.606485] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:48.650 [2024-07-24 19:03:33.606541] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:48.650 [2024-07-24 19:03:33.606553] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.650 [2024-07-24 19:03:33.614505] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:48.650 [2024-07-24 19:03:33.614520] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:48.650 [2024-07-24 19:03:33.614528] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.650 [2024-07-24 19:03:33.622526] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:48.650 [2024-07-24 19:03:33.622540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:48.650 [2024-07-24 19:03:33.622548] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.909 Running I/O for 1 seconds... 00:26:49.845 00:26:49.845 Latency(us) 00:26:49.845 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:49.845 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:49.845 crypto_ram : 1.01 41130.90 160.67 0.00 0.00 3105.17 1427.75 4681.14 00:26:49.845 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:49.845 crypto_ram3 : 1.01 20595.97 80.45 0.00 0.00 6182.10 2106.51 7084.13 00:26:49.846 =================================================================================================================== 00:26:49.846 Total : 61726.87 241.12 0.00 0.00 4132.92 1427.75 7084.13 00:26:49.846 00:26:49.846 real 0m1.568s 00:26:49.846 user 0m1.402s 00:26:49.846 sys 0m0.146s 00:26:49.846 19:03:34 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:49.846 19:03:34 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:26:49.846 ************************************ 00:26:49.846 END TEST bdev_write_zeroes 00:26:49.846 ************************************ 00:26:50.103 19:03:34 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.103 19:03:34 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:50.103 19:03:34 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:50.103 19:03:34 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:50.103 ************************************ 00:26:50.103 START TEST bdev_json_nonenclosed 00:26:50.103 
************************************ 00:26:50.103 19:03:34 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.103 [2024-07-24 19:03:34.973382] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:50.103 [2024-07-24 19:03:34.973424] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2244457 ] 00:26:50.103 [2024-07-24 19:03:35.038021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:50.103 [2024-07-24 19:03:35.110367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:50.103 [2024-07-24 19:03:35.110428] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:26:50.103 [2024-07-24 19:03:35.110442] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:50.103 [2024-07-24 19:03:35.110450] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:50.361 00:26:50.361 real 0m0.266s 00:26:50.361 user 0m0.174s 00:26:50.361 sys 0m0.090s 00:26:50.361 19:03:35 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:50.361 19:03:35 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:50.361 ************************************ 00:26:50.361 END TEST bdev_json_nonenclosed 00:26:50.361 ************************************ 00:26:50.361 19:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.361 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:50.361 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:50.361 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:50.361 ************************************ 00:26:50.361 START TEST bdev_json_nonarray 00:26:50.361 ************************************ 00:26:50.361 19:03:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.362 [2024-07-24 19:03:35.307465] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:26:50.362 [2024-07-24 19:03:35.307507] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2244658 ] 00:26:50.620 [2024-07-24 19:03:35.371320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:50.620 [2024-07-24 19:03:35.441975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:50.620 [2024-07-24 19:03:35.442041] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:26:50.620 [2024-07-24 19:03:35.442052] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:50.620 [2024-07-24 19:03:35.442060] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:50.620 00:26:50.620 real 0m0.258s 00:26:50.620 user 0m0.166s 00:26:50.620 sys 0m0.090s 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:50.620 ************************************ 00:26:50.620 END TEST bdev_json_nonarray 00:26:50.620 ************************************ 00:26:50.620 19:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:26:50.620 19:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:26:50.620 19:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:26:50.620 19:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:26:50.620 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:50.620 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:50.620 19:03:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:50.620 ************************************ 00:26:50.620 START TEST bdev_crypto_enomem 00:26:50.620 ************************************ 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=2244679 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 2244679 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2244679 ']' 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 
00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:50.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:50.620 19:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:50.620 [2024-07-24 19:03:35.620202] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:50.620 [2024-07-24 19:03:35.620238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2244679 ] 00:26:50.878 [2024-07-24 19:03:35.684780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:50.878 [2024-07-24 19:03:35.762143] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:51.446 true 00:26:51.446 base0 00:26:51.446 true 00:26:51.446 [2024-07-24 19:03:36.442757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:51.446 crypt0 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:51.446 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@10 -- # set +x 00:26:51.705 [ 00:26:51.705 { 00:26:51.705 "name": "crypt0", 00:26:51.705 "aliases": [ 00:26:51.705 "893fdc2e-c16c-57bc-835d-e20a4dcb12cd" 00:26:51.705 ], 00:26:51.705 "product_name": "crypto", 00:26:51.705 "block_size": 512, 00:26:51.705 "num_blocks": 2097152, 00:26:51.705 "uuid": "893fdc2e-c16c-57bc-835d-e20a4dcb12cd", 00:26:51.705 "assigned_rate_limits": { 00:26:51.705 "rw_ios_per_sec": 0, 00:26:51.705 "rw_mbytes_per_sec": 0, 00:26:51.705 "r_mbytes_per_sec": 0, 00:26:51.705 "w_mbytes_per_sec": 0 00:26:51.705 }, 00:26:51.705 "claimed": false, 00:26:51.705 "zoned": false, 00:26:51.705 "supported_io_types": { 00:26:51.705 "read": true, 00:26:51.705 "write": true, 00:26:51.705 "unmap": false, 00:26:51.705 "flush": false, 00:26:51.705 "reset": true, 00:26:51.705 "nvme_admin": false, 00:26:51.705 "nvme_io": false, 00:26:51.705 "nvme_io_md": false, 00:26:51.705 "write_zeroes": true, 00:26:51.705 "zcopy": false, 00:26:51.705 "get_zone_info": false, 00:26:51.705 "zone_management": false, 00:26:51.705 "zone_append": false, 00:26:51.705 "compare": false, 00:26:51.705 "compare_and_write": false, 00:26:51.705 "abort": false, 00:26:51.705 "seek_hole": false, 00:26:51.705 "seek_data": false, 00:26:51.705 "copy": false, 00:26:51.705 "nvme_iov_md": false 00:26:51.705 }, 00:26:51.705 "memory_domains": [ 00:26:51.705 { 00:26:51.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:51.705 "dma_device_type": 2 00:26:51.705 } 00:26:51.705 ], 00:26:51.705 "driver_specific": { 00:26:51.705 "crypto": { 00:26:51.705 "base_bdev_name": "EE_base0", 00:26:51.705 "name": "crypt0", 00:26:51.705 "key_name": "test_dek_sw" 00:26:51.705 } 00:26:51.705 } 00:26:51.705 } 00:26:51.705 ] 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=2244797 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:26:51.705 19:03:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:51.705 Running I/O for 5 seconds... 
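For the ENOMEM check that follows, crypt0 sits on the error-injection bdev EE_base0, and the test injects 'nomem' write failures while a queued randwrite run is in flight, presumably to confirm the crypto bdev retries them rather than reporting I/O errors (the summary below shows zero failures). A sketch of the sequence using only calls that appear in this trace; rpc.py stands in for the suite's rpc_cmd helper, and the background/wait handling is simplified:

```bash
# Sketch only: condensed from the bdev_crypto_enomem trace around this point.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed shorthand

"$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests &   # start the queued 5 s randwrite run
rpc.py bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem   # arguments exactly as in the trace
wait                                                             # let the run drain the injected errors
rpc.py bdev_crypto_delete crypt0                                 # teardown once the run completes
```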
00:26:52.641 19:03:37 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:26:52.641 19:03:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:52.641 19:03:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:52.641 19:03:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:52.641 19:03:37 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 2244797 00:26:56.830 00:26:56.830 Latency(us) 00:26:56.830 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:56.830 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:26:56.830 crypt0 : 5.00 55513.50 216.85 0.00 0.00 574.32 259.41 1388.74 00:26:56.830 =================================================================================================================== 00:26:56.831 Total : 55513.50 216.85 0.00 0.00 574.32 259.41 1388.74 00:26:56.831 0 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 2244679 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2244679 ']' 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2244679 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2244679 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2244679' 00:26:56.831 killing process with pid 2244679 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2244679 00:26:56.831 Received shutdown signal, test time was about 5.000000 seconds 00:26:56.831 00:26:56.831 Latency(us) 00:26:56.831 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:56.831 =================================================================================================================== 00:26:56.831 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2244679 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:26:56.831 00:26:56.831 real 0m6.220s 00:26:56.831 user 0m6.427s 00:26:56.831 sys 0m0.244s 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:26:56.831 19:03:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:56.831 ************************************ 00:26:56.831 END TEST bdev_crypto_enomem 00:26:56.831 ************************************ 00:26:56.831 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:26:56.831 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 00:26:56.831 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:26:56.831 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:57.144 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:26:57.144 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:26:57.144 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:26:57.144 19:03:41 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:26:57.144 00:26:57.144 real 0m50.603s 00:26:57.144 user 1m35.120s 00:26:57.144 sys 0m4.322s 00:26:57.144 19:03:41 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:57.144 19:03:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:57.144 ************************************ 00:26:57.144 END TEST blockdev_crypto_sw 00:26:57.144 ************************************ 00:26:57.144 19:03:41 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:26:57.144 19:03:41 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:57.144 19:03:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:57.144 19:03:41 -- common/autotest_common.sh@10 -- # set +x 00:26:57.144 ************************************ 00:26:57.144 START TEST blockdev_crypto_qat 00:26:57.144 ************************************ 00:26:57.144 19:03:41 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:26:57.144 * Looking for test storage... 
00:26:57.144 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:57.144 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:57.144 19:03:41 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:26:57.144 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:57.144 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:26:57.145 19:03:41 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # env_ctx= 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2245687 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:57.145 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2245687 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2245687 ']' 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:26:57.145 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:57.145 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:57.145 [2024-07-24 19:03:42.064330] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:26:57.145 [2024-07-24 19:03:42.064377] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2245687 ] 00:26:57.145 [2024-07-24 19:03:42.130249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:57.407 [2024-07-24 19:03:42.212022] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:57.973 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:57.973 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:26:57.973 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:26:57.973 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:26:57.973 19:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:26:57.973 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:57.973 19:03:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:57.973 [2024-07-24 19:03:42.861966] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:26:57.973 [2024-07-24 19:03:42.869999] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:57.973 [2024-07-24 19:03:42.878014] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:57.973 [2024-07-24 19:03:42.941052] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:00.522 true 00:27:00.522 true 00:27:00.522 true 00:27:00.522 true 00:27:00.522 Malloc0 00:27:00.522 Malloc1 00:27:00.522 Malloc2 00:27:00.522 Malloc3 00:27:00.522 [2024-07-24 19:03:45.214756] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:00.522 crypto_ram 00:27:00.523 [2024-07-24 19:03:45.222774] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:00.523 crypto_ram1 00:27:00.523 [2024-07-24 19:03:45.230793] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:00.523 crypto_ram2 00:27:00.523 [2024-07-24 19:03:45.238813] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:00.523 crypto_ram3 00:27:00.523 [ 00:27:00.523 { 00:27:00.523 "name": "Malloc1", 00:27:00.523 "aliases": [ 00:27:00.523 "6d586967-129f-4101-89e2-1f21d2eb19cc" 00:27:00.523 ], 00:27:00.523 "product_name": "Malloc disk", 00:27:00.523 "block_size": 512, 00:27:00.523 "num_blocks": 65536, 00:27:00.523 "uuid": "6d586967-129f-4101-89e2-1f21d2eb19cc", 00:27:00.523 "assigned_rate_limits": { 00:27:00.523 "rw_ios_per_sec": 0, 00:27:00.523 "rw_mbytes_per_sec": 0, 00:27:00.523 "r_mbytes_per_sec": 0, 00:27:00.523 "w_mbytes_per_sec": 0 00:27:00.523 }, 00:27:00.523 "claimed": true, 00:27:00.523 "claim_type": "exclusive_write", 00:27:00.523 "zoned": false, 00:27:00.523 "supported_io_types": { 
00:27:00.523 "read": true, 00:27:00.523 "write": true, 00:27:00.523 "unmap": true, 00:27:00.523 "flush": true, 00:27:00.523 "reset": true, 00:27:00.523 "nvme_admin": false, 00:27:00.523 "nvme_io": false, 00:27:00.523 "nvme_io_md": false, 00:27:00.523 "write_zeroes": true, 00:27:00.523 "zcopy": true, 00:27:00.523 "get_zone_info": false, 00:27:00.523 "zone_management": false, 00:27:00.523 "zone_append": false, 00:27:00.523 "compare": false, 00:27:00.523 "compare_and_write": false, 00:27:00.523 "abort": true, 00:27:00.523 "seek_hole": false, 00:27:00.523 "seek_data": false, 00:27:00.523 "copy": true, 00:27:00.523 "nvme_iov_md": false 00:27:00.523 }, 00:27:00.523 "memory_domains": [ 00:27:00.523 { 00:27:00.523 "dma_device_id": "system", 00:27:00.523 "dma_device_type": 1 00:27:00.523 }, 00:27:00.523 { 00:27:00.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:00.523 "dma_device_type": 2 00:27:00.523 } 00:27:00.523 ], 00:27:00.523 "driver_specific": {} 00:27:00.523 } 00:27:00.523 ] 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "30a265c9-0e53-591f-b1a8-7cabaea47b73"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "30a265c9-0e53-591f-b1a8-7cabaea47b73",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "cf136c12-6602-59ca-a3fb-2ef4d545b620"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cf136c12-6602-59ca-a3fb-2ef4d545b620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5eb97947-a2f1-5113-85d3-b1b15f52115f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5eb97947-a2f1-5113-85d3-b1b15f52115f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c25dcc82-8009-59ed-aa50-b951ec840a73"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": 
"c25dcc82-8009-59ed-aa50-b951ec840a73",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:27:00.523 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 2245687 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2245687 ']' 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2245687 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2245687 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2245687' 00:27:00.523 killing process with pid 2245687 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2245687 00:27:00.523 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2245687 00:27:01.089 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:01.089 19:03:45 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:01.089 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:01.089 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:01.089 19:03:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:01.089 ************************************ 00:27:01.089 START TEST bdev_hello_world 00:27:01.089 ************************************ 00:27:01.089 19:03:45 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:01.089 [2024-07-24 19:03:45.980335] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 
initialization... 00:27:01.089 [2024-07-24 19:03:45.980376] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2246356 ] 00:27:01.089 [2024-07-24 19:03:46.042837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:01.347 [2024-07-24 19:03:46.114308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:01.347 [2024-07-24 19:03:46.135185] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:01.347 [2024-07-24 19:03:46.143209] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:01.347 [2024-07-24 19:03:46.151227] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:01.347 [2024-07-24 19:03:46.251062] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:03.874 [2024-07-24 19:03:48.386640] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:03.874 [2024-07-24 19:03:48.386695] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:03.874 [2024-07-24 19:03:48.386706] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.874 [2024-07-24 19:03:48.394663] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:03.874 [2024-07-24 19:03:48.394684] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:03.874 [2024-07-24 19:03:48.394694] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.874 [2024-07-24 19:03:48.402686] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:03.874 [2024-07-24 19:03:48.402699] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:03.874 [2024-07-24 19:03:48.402712] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.874 [2024-07-24 19:03:48.410697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:03.874 [2024-07-24 19:03:48.410709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:03.874 [2024-07-24 19:03:48.410716] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.874 [2024-07-24 19:03:48.477736] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:03.874 [2024-07-24 19:03:48.477772] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:03.874 [2024-07-24 19:03:48.477783] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:03.874 [2024-07-24 19:03:48.478627] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:03.874 [2024-07-24 19:03:48.478692] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:03.874 [2024-07-24 19:03:48.478707] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:03.874 [2024-07-24 19:03:48.478742] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
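The trace above is the stock hello_bdev example run against the QAT-backed crypto_ram vbdev: the app opens the bdev, takes an I/O channel, writes the test string, reads it back, and stops once the read completes with "Hello World!". A minimal sketch of the invocation used here, assuming an SPDK build tree as the working directory and a bdev.json that already defines crypto_ram on top of its Malloc base:

# Same binary and arguments as the run_test call recorded above; the JSON file
# supplies the Malloc/crypto bdev stack, -b names the bdev to open.
./build/examples/hello_bdev --json ./test/bdev/bdev.json -b crypto_ram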
00:27:03.874 00:27:03.874 [2024-07-24 19:03:48.478755] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:03.875 00:27:03.875 real 0m2.837s 00:27:03.875 user 0m2.559s 00:27:03.875 sys 0m0.242s 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:03.875 ************************************ 00:27:03.875 END TEST bdev_hello_world 00:27:03.875 ************************************ 00:27:03.875 19:03:48 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:27:03.875 19:03:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:03.875 19:03:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:03.875 19:03:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:03.875 ************************************ 00:27:03.875 START TEST bdev_bounds 00:27:03.875 ************************************ 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=2246828 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 2246828' 00:27:03.875 Process bdevio pid: 2246828 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 2246828 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2246828 ']' 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:03.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:03.875 19:03:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:04.133 [2024-07-24 19:03:48.889332] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:27:04.133 [2024-07-24 19:03:48.889374] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2246828 ] 00:27:04.133 [2024-07-24 19:03:48.953443] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:04.133 [2024-07-24 19:03:49.027230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:04.133 [2024-07-24 19:03:49.027324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:04.133 [2024-07-24 19:03:49.027327] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:04.133 [2024-07-24 19:03:49.048323] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:04.133 [2024-07-24 19:03:49.056352] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:04.133 [2024-07-24 19:03:49.064371] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:04.391 [2024-07-24 19:03:49.168802] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:06.294 [2024-07-24 19:03:51.303151] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:06.294 [2024-07-24 19:03:51.303210] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:06.294 [2024-07-24 19:03:51.303222] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:06.553 [2024-07-24 19:03:51.311167] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:06.553 [2024-07-24 19:03:51.311189] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:06.553 [2024-07-24 19:03:51.311199] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:06.553 [2024-07-24 19:03:51.319188] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:06.553 [2024-07-24 19:03:51.319201] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:06.553 [2024-07-24 19:03:51.319210] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:06.553 [2024-07-24 19:03:51.327209] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:06.553 [2024-07-24 19:03:51.327222] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:06.553 [2024-07-24 19:03:51.327229] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:06.553 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:06.553 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:27:06.553 19:03:51 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:06.553 I/O targets: 00:27:06.553 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:27:06.553 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:27:06.553 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:27:06.553 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:27:06.553 
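With the four crypto vbdevs listed as I/O targets above, the bounds test is a client/server pair: bdevio is started against the shared bdev.json configuration, and the per-bdev read/write/reset/compare suites that follow are then triggered over RPC. A minimal sketch of the two halves, assuming an SPDK checkout as the working directory (the -w and -s 0 flags are copied from the run recorded above):

# Server: start bdevio on the test configuration and wait for the RPC trigger.
./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
# Client: run the blockdev I/O suites on every exposed bdev.
./test/bdev/bdevio/tests.py perform_tests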
00:27:06.553 00:27:06.553 CUnit - A unit testing framework for C - Version 2.1-3 00:27:06.553 http://cunit.sourceforge.net/ 00:27:06.553 00:27:06.553 00:27:06.553 Suite: bdevio tests on: crypto_ram3 00:27:06.553 Test: blockdev write read block ...passed 00:27:06.553 Test: blockdev write zeroes read block ...passed 00:27:06.553 Test: blockdev write zeroes read no split ...passed 00:27:06.553 Test: blockdev write zeroes read split ...passed 00:27:06.553 Test: blockdev write zeroes read split partial ...passed 00:27:06.553 Test: blockdev reset ...passed 00:27:06.553 Test: blockdev write read 8 blocks ...passed 00:27:06.553 Test: blockdev write read size > 128k ...passed 00:27:06.553 Test: blockdev write read invalid size ...passed 00:27:06.553 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:06.553 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:06.553 Test: blockdev write read max offset ...passed 00:27:06.553 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:06.553 Test: blockdev writev readv 8 blocks ...passed 00:27:06.553 Test: blockdev writev readv 30 x 1block ...passed 00:27:06.553 Test: blockdev writev readv block ...passed 00:27:06.553 Test: blockdev writev readv size > 128k ...passed 00:27:06.553 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:06.553 Test: blockdev comparev and writev ...passed 00:27:06.553 Test: blockdev nvme passthru rw ...passed 00:27:06.553 Test: blockdev nvme passthru vendor specific ...passed 00:27:06.553 Test: blockdev nvme admin passthru ...passed 00:27:06.553 Test: blockdev copy ...passed 00:27:06.553 Suite: bdevio tests on: crypto_ram2 00:27:06.553 Test: blockdev write read block ...passed 00:27:06.553 Test: blockdev write zeroes read block ...passed 00:27:06.553 Test: blockdev write zeroes read no split ...passed 00:27:06.553 Test: blockdev write zeroes read split ...passed 00:27:06.553 Test: blockdev write zeroes read split partial ...passed 00:27:06.553 Test: blockdev reset ...passed 00:27:06.553 Test: blockdev write read 8 blocks ...passed 00:27:06.553 Test: blockdev write read size > 128k ...passed 00:27:06.553 Test: blockdev write read invalid size ...passed 00:27:06.554 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:06.554 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:06.554 Test: blockdev write read max offset ...passed 00:27:06.554 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:06.554 Test: blockdev writev readv 8 blocks ...passed 00:27:06.554 Test: blockdev writev readv 30 x 1block ...passed 00:27:06.554 Test: blockdev writev readv block ...passed 00:27:06.554 Test: blockdev writev readv size > 128k ...passed 00:27:06.554 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:06.554 Test: blockdev comparev and writev ...passed 00:27:06.554 Test: blockdev nvme passthru rw ...passed 00:27:06.554 Test: blockdev nvme passthru vendor specific ...passed 00:27:06.554 Test: blockdev nvme admin passthru ...passed 00:27:06.554 Test: blockdev copy ...passed 00:27:06.554 Suite: bdevio tests on: crypto_ram1 00:27:06.554 Test: blockdev write read block ...passed 00:27:06.554 Test: blockdev write zeroes read block ...passed 00:27:06.554 Test: blockdev write zeroes read no split ...passed 00:27:06.812 Test: blockdev write zeroes read split ...passed 00:27:06.812 Test: blockdev write zeroes read split partial ...passed 00:27:06.812 Test: blockdev reset 
...passed 00:27:06.812 Test: blockdev write read 8 blocks ...passed 00:27:06.812 Test: blockdev write read size > 128k ...passed 00:27:06.812 Test: blockdev write read invalid size ...passed 00:27:06.812 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:06.812 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:06.812 Test: blockdev write read max offset ...passed 00:27:06.812 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:06.812 Test: blockdev writev readv 8 blocks ...passed 00:27:06.812 Test: blockdev writev readv 30 x 1block ...passed 00:27:06.812 Test: blockdev writev readv block ...passed 00:27:06.812 Test: blockdev writev readv size > 128k ...passed 00:27:06.812 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:06.812 Test: blockdev comparev and writev ...passed 00:27:06.812 Test: blockdev nvme passthru rw ...passed 00:27:06.812 Test: blockdev nvme passthru vendor specific ...passed 00:27:06.812 Test: blockdev nvme admin passthru ...passed 00:27:06.812 Test: blockdev copy ...passed 00:27:06.812 Suite: bdevio tests on: crypto_ram 00:27:06.812 Test: blockdev write read block ...passed 00:27:06.812 Test: blockdev write zeroes read block ...passed 00:27:06.812 Test: blockdev write zeroes read no split ...passed 00:27:06.812 Test: blockdev write zeroes read split ...passed 00:27:06.812 Test: blockdev write zeroes read split partial ...passed 00:27:06.812 Test: blockdev reset ...passed 00:27:06.812 Test: blockdev write read 8 blocks ...passed 00:27:06.812 Test: blockdev write read size > 128k ...passed 00:27:06.812 Test: blockdev write read invalid size ...passed 00:27:06.812 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:06.812 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:06.812 Test: blockdev write read max offset ...passed 00:27:06.812 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:06.812 Test: blockdev writev readv 8 blocks ...passed 00:27:06.812 Test: blockdev writev readv 30 x 1block ...passed 00:27:06.812 Test: blockdev writev readv block ...passed 00:27:06.812 Test: blockdev writev readv size > 128k ...passed 00:27:06.812 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:06.812 Test: blockdev comparev and writev ...passed 00:27:06.812 Test: blockdev nvme passthru rw ...passed 00:27:06.812 Test: blockdev nvme passthru vendor specific ...passed 00:27:06.812 Test: blockdev nvme admin passthru ...passed 00:27:06.812 Test: blockdev copy ...passed 00:27:06.812 00:27:06.812 Run Summary: Type Total Ran Passed Failed Inactive 00:27:06.812 suites 4 4 n/a 0 0 00:27:06.812 tests 92 92 92 0 0 00:27:06.812 asserts 520 520 520 0 n/a 00:27:06.812 00:27:06.812 Elapsed time = 0.510 seconds 00:27:06.812 0 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 2246828 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2246828 ']' 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2246828 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2246828 00:27:06.812 19:03:51 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:06.812 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2246828' 00:27:06.813 killing process with pid 2246828 00:27:06.813 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2246828 00:27:06.813 19:03:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2246828 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:27:07.381 00:27:07.381 real 0m3.257s 00:27:07.381 user 0m9.182s 00:27:07.381 sys 0m0.394s 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:07.381 ************************************ 00:27:07.381 END TEST bdev_bounds 00:27:07.381 ************************************ 00:27:07.381 19:03:52 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:07.381 19:03:52 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:07.381 19:03:52 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:07.381 19:03:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:07.381 ************************************ 00:27:07.381 START TEST bdev_nbd 00:27:07.381 ************************************ 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:07.381 
19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=2247486 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 2247486 /var/tmp/spdk-nbd.sock 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2247486 ']' 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:07.381 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:07.381 19:03:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:07.381 [2024-07-24 19:03:52.222679] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
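The bdev_nbd test starting here exports each crypto vbdev as a kernel /dev/nbdX node through the NBD RPCs and then checks the mapping with a direct-I/O dd read, as the nbd_common.sh helpers below do. A minimal sketch of one start/verify/stop cycle, assuming an SPDK checkout as the working directory, the bdev_svc app already listening on /var/tmp/spdk-nbd.sock as in this run, /dev/nbd0 free, and an arbitrary output file for dd:

# Map the crypto_ram bdev to /dev/nbd0 over the dedicated NBD RPC socket.
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
# Read one 4 KiB block with O_DIRECT to confirm the export works end to end.
dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
# Tear the mapping down and confirm nothing is left exported.
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks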
00:27:07.381 [2024-07-24 19:03:52.222720] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:07.381 [2024-07-24 19:03:52.288358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:07.381 [2024-07-24 19:03:52.360334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:07.381 [2024-07-24 19:03:52.381216] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:07.381 [2024-07-24 19:03:52.389250] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:07.640 [2024-07-24 19:03:52.397254] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:07.640 [2024-07-24 19:03:52.491305] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:10.176 [2024-07-24 19:03:54.624089] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:10.176 [2024-07-24 19:03:54.624139] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:10.176 [2024-07-24 19:03:54.624150] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:10.176 [2024-07-24 19:03:54.632112] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:10.176 [2024-07-24 19:03:54.632134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:10.176 [2024-07-24 19:03:54.632142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:10.176 [2024-07-24 19:03:54.640128] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:10.176 [2024-07-24 19:03:54.640142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:10.176 [2024-07-24 19:03:54.640150] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:10.176 [2024-07-24 19:03:54.648148] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:10.176 [2024-07-24 19:03:54.648160] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:10.176 [2024-07-24 19:03:54.648167] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:10.176 1+0 records in 00:27:10.176 1+0 records out 00:27:10.176 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219677 s, 18.6 MB/s 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:10.176 19:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:10.176 1+0 records in 00:27:10.176 1+0 records out 00:27:10.176 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208637 s, 19.6 MB/s 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:10.176 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:10.434 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:10.434 19:03:55 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:10.434 1+0 records in 00:27:10.434 1+0 records out 00:27:10.434 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229473 s, 17.8 MB/s 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:10.435 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:10.693 1+0 records in 00:27:10.693 1+0 records out 00:27:10.693 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000214834 s, 19.1 MB/s 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:10.693 19:03:55 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:10.693 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd0", 00:27:10.951 "bdev_name": "crypto_ram" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd1", 00:27:10.951 "bdev_name": "crypto_ram1" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd2", 00:27:10.951 "bdev_name": "crypto_ram2" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd3", 00:27:10.951 "bdev_name": "crypto_ram3" 00:27:10.951 } 00:27:10.951 ]' 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd0", 00:27:10.951 "bdev_name": "crypto_ram" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd1", 00:27:10.951 "bdev_name": "crypto_ram1" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd2", 00:27:10.951 "bdev_name": "crypto_ram2" 00:27:10.951 }, 00:27:10.951 { 00:27:10.951 "nbd_device": "/dev/nbd3", 00:27:10.951 "bdev_name": "crypto_ram3" 00:27:10.951 } 00:27:10.951 ]' 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:10.951 19:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:11.209 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:11.467 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:11.726 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:11.727 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:11.985 /dev/nbd0 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:11.985 19:03:56 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:11.985 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:11.986 1+0 records in 00:27:11.986 1+0 records out 00:27:11.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00017443 s, 23.5 MB/s 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:11.986 19:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:27:12.244 /dev/nbd1 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:12.244 1+0 records in 00:27:12.244 1+0 records out 00:27:12.244 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255178 s, 16.1 MB/s 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:12.244 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:27:12.502 /dev/nbd10 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:12.502 1+0 records in 00:27:12.502 1+0 records out 00:27:12.502 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024591 s, 16.7 MB/s 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:27:12.502 /dev/nbd11 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:12.502 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:12.760 1+0 records in 00:27:12.760 1+0 records out 00:27:12.760 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237219 s, 17.3 MB/s 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd0", 00:27:12.760 "bdev_name": "crypto_ram" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd1", 00:27:12.760 "bdev_name": "crypto_ram1" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd10", 00:27:12.760 "bdev_name": "crypto_ram2" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd11", 00:27:12.760 "bdev_name": "crypto_ram3" 00:27:12.760 } 00:27:12.760 ]' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd0", 00:27:12.760 "bdev_name": "crypto_ram" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd1", 00:27:12.760 "bdev_name": "crypto_ram1" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd10", 00:27:12.760 "bdev_name": "crypto_ram2" 00:27:12.760 }, 00:27:12.760 { 00:27:12.760 "nbd_device": "/dev/nbd11", 00:27:12.760 "bdev_name": "crypto_ram3" 00:27:12.760 } 00:27:12.760 ]' 00:27:12.760 
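Note: nbd_get_disks returns the JSON array echoed above; the nbd_get_count helper reduces it to a plain device count so the test can assert that all four exports are up (and, after nbd_stop_disk, that the count drops back to zero). A sketch of that reduction using the same rpc.py socket and jq/grep pipeline shown in the trace; variable names are illustrative:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  nbd_disks_json=$($rpc nbd_get_disks)
  # One /dev/nbdX per line; grep -c counts them, and the trailing true keeps an
  # empty list (grep exit status 1) from tripping set -e, as at nbd_common.sh@65.
  nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
  count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
  [ "$count" -eq 4 ] || exit 1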
19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:12.760 /dev/nbd1 00:27:12.760 /dev/nbd10 00:27:12.760 /dev/nbd11' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:12.760 /dev/nbd1 00:27:12.760 /dev/nbd10 00:27:12.760 /dev/nbd11' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:12.760 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:13.019 256+0 records in 00:27:13.019 256+0 records out 00:27:13.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103107 s, 102 MB/s 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:13.019 256+0 records in 00:27:13.019 256+0 records out 00:27:13.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0348395 s, 30.1 MB/s 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:13.019 256+0 records in 00:27:13.019 256+0 records out 00:27:13.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301812 s, 34.7 MB/s 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:27:13.019 256+0 records in 00:27:13.019 256+0 records out 00:27:13.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0262813 s, 39.9 MB/s 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:27:13.019 256+0 records in 00:27:13.019 256+0 records out 00:27:13.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0245433 s, 42.7 MB/s 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:13.019 19:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:13.277 19:03:58 
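Note: the write/verify pass above is driven by a single 1 MiB random file: it is generated once from /dev/urandom, written through each NBD device with O_DIRECT, then compared byte-for-byte against every device before being removed. A condensed sketch of that loop as the trace shows it, with the nbd_list contents taken from this run:

  nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)
  tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
  # 256 x 4 KiB = 1 MiB of random data shared by all four devices.
  dd if=/dev/urandom of=$tmp_file bs=4096 count=256
  for dev in "${nbd_list[@]}"; do
      dd if=$tmp_file of=$dev bs=4096 count=256 oflag=direct
  done
  # Verify: the first 1 MiB of each device must match the source file exactly.
  for dev in "${nbd_list[@]}"; do
      cmp -b -n 1M $tmp_file $dev
  done
  rm $tmp_file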
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:13.277 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:13.535 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:13.536 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:13.793 19:03:58 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:13.793 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:14.051 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:14.052 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:14.052 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:14.052 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:14.052 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:14.052 19:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:14.052 malloc_lvol_verify 00:27:14.052 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:14.310 d383ebb4-2b0c-412b-8fd9-10c46f520676 00:27:14.310 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:27:14.568 f2adad78-0cf0-43aa-8576-87861a5ab535 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:14.568 /dev/nbd0 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
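Note: the nbd_with_lvol_verify step starting here builds a small logical volume and proves it can hold a filesystem end to end; the mke2fs output that follows comes from formatting that volume over NBD. The RPC sequence, as issued in this trace (sizes are in MiB; the lvs/lvol names are the ones used by this run):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  # 16 MiB malloc bdev with 512-byte blocks backs the lvolstore.
  $rpc bdev_malloc_create -b malloc_lvol_verify 16 512
  $rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs
  # 4 MiB volume, exported to the host and formatted with ext4.
  $rpc bdev_lvol_create lvol 4 -l lvs
  $rpc nbd_start_disk lvs/lvol /dev/nbd0
  mkfs.ext4 /dev/nbd0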
00:27:14.568 mke2fs 1.46.5 (30-Dec-2021) 00:27:14.568 Discarding device blocks: 0/4096 done 00:27:14.568 Creating filesystem with 4096 1k blocks and 1024 inodes 00:27:14.568 00:27:14.568 Allocating group tables: 0/1 done 00:27:14.568 Writing inode tables: 0/1 done 00:27:14.568 Creating journal (1024 blocks): done 00:27:14.568 Writing superblocks and filesystem accounting information: 0/1 done 00:27:14.568 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:14.568 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 2247486 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2247486 ']' 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2247486 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2247486 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2247486' 00:27:14.827 killing process with pid 2247486 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2247486 00:27:14.827 19:03:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
2247486 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:27:15.394 00:27:15.394 real 0m8.129s 00:27:15.394 user 0m10.765s 00:27:15.394 sys 0m2.443s 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:15.394 ************************************ 00:27:15.394 END TEST bdev_nbd 00:27:15.394 ************************************ 00:27:15.394 19:04:00 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:27:15.394 19:04:00 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:27:15.394 19:04:00 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:27:15.394 19:04:00 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:27:15.394 19:04:00 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:15.394 19:04:00 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.394 19:04:00 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.394 ************************************ 00:27:15.394 START TEST bdev_fio 00:27:15.394 ************************************ 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:15.394 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=verify 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type=AIO 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z verify ']' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1311 -- # '[' verify == verify ']' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # cat 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1321 -- # '[' AIO == AIO ']' 00:27:15.394 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # /usr/src/fio/fio --version 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # echo serialize_overlap=1 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:15.653 ************************************ 00:27:15.653 START TEST bdev_fio_rw_verify 00:27:15.653 ************************************ 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local sanitizers 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # shift 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # local asan_lib= 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libasan 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # asan_lib= 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:15.653 19:04:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:15.912 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:15.912 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:15.912 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:15.912 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:15.912 fio-3.35 00:27:15.912 Starting 4 threads 00:27:30.821 00:27:30.821 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2249531: Wed Jul 24 19:04:13 2024 00:27:30.821 read: IOPS=27.0k, BW=106MiB/s (111MB/s)(1056MiB/10001msec) 00:27:30.821 slat (usec): min=11, max=498, avg=52.65, stdev=37.32 00:27:30.821 clat (usec): min=16, max=1530, avg=290.08, stdev=203.01 00:27:30.821 lat (usec): min=37, max=1710, avg=342.73, stdev=224.39 00:27:30.821 clat percentiles (usec): 00:27:30.821 | 50.000th=[ 229], 99.000th=[ 1020], 99.900th=[ 1237], 99.990th=[ 1352], 00:27:30.821 | 99.999th=[ 1467] 00:27:30.821 write: IOPS=29.7k, BW=116MiB/s (122MB/s)(1132MiB/9744msec); 0 zone resets 00:27:30.821 slat (usec): min=18, max=970, avg=60.60, stdev=36.29 00:27:30.821 clat (usec): min=16, max=1522, avg=318.22, stdev=206.28 00:27:30.821 lat (usec): min=46, max=1715, avg=378.83, stdev=226.45 00:27:30.821 clat percentiles (usec): 00:27:30.821 | 50.000th=[ 269], 99.000th=[ 1074], 99.900th=[ 1287], 99.990th=[ 1401], 00:27:30.821 | 99.999th=[ 1483] 00:27:30.821 bw ( KiB/s): min=98200, max=159596, per=97.58%, avg=116117.26, stdev=3403.82, samples=76 00:27:30.821 iops : min=24550, max=39899, avg=29029.32, stdev=850.95, samples=76 00:27:30.821 lat (usec) : 20=0.01%, 50=0.04%, 100=7.06%, 250=43.76%, 500=35.37% 00:27:30.821 lat (usec) : 750=8.90%, 1000=3.48% 00:27:30.821 lat (msec) : 2=1.38% 00:27:30.821 cpu : usr=99.69%, sys=0.00%, ctx=78, majf=0, minf=268 00:27:30.821 IO depths : 1=1.8%, 2=28.1%, 4=56.1%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:30.821 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:30.821 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:30.821 issued rwts: total=270286,289869,0,0 short=0,0,0,0 dropped=0,0,0,0 00:27:30.821 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:30.821 00:27:30.821 Run status group 0 (all jobs): 00:27:30.821 READ: bw=106MiB/s (111MB/s), 106MiB/s-106MiB/s (111MB/s-111MB/s), io=1056MiB (1107MB), run=10001-10001msec 00:27:30.821 WRITE: bw=116MiB/s (122MB/s), 116MiB/s-116MiB/s (122MB/s-122MB/s), io=1132MiB (1187MB), run=9744-9744msec 00:27:30.821 00:27:30.821 real 0m13.255s 00:27:30.821 user 0m48.557s 00:27:30.821 sys 0m0.394s 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:30.821 ************************************ 00:27:30.821 END TEST bdev_fio_rw_verify 00:27:30.821 ************************************ 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:27:30.821 19:04:13 
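Note: the four jobs summarized above are the generated bdev.fio sections (one [job_crypto_*] block with filename= per crypto bdev, as echoed at blockdev.sh@341/342), run through the SPDK fio bdev plugin. For reference, the invocation is spread across several trace records; consolidated onto one command line it is:

  LD_PRELOAD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev \
  /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
      /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
      --verify_state_save=0 \
      --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
      --spdk_mem=0 \
      --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output

With iodepth=8 and 4 KiB blocks across the four QAT-backed crypto bdevs, the run above sustained roughly 106 MiB/s reads and 116 MiB/s writes (about 29k write IOPS) over the 10-second window, with data verification enabled.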
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local workload=trim 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local bdev_type= 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local env_context= 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local fio_dir=/usr/src/fio 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1289 -- # '[' -z trim ']' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1293 -- # '[' -n '' ']' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # cat 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1311 -- # '[' trim == verify ']' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1326 -- # '[' trim == trim ']' 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1327 -- # echo rw=trimwrite 00:27:30.821 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "30a265c9-0e53-591f-b1a8-7cabaea47b73"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "30a265c9-0e53-591f-b1a8-7cabaea47b73",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "cf136c12-6602-59ca-a3fb-2ef4d545b620"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cf136c12-6602-59ca-a3fb-2ef4d545b620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5eb97947-a2f1-5113-85d3-b1b15f52115f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5eb97947-a2f1-5113-85d3-b1b15f52115f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c25dcc82-8009-59ed-aa50-b951ec840a73"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c25dcc82-8009-59ed-aa50-b951ec840a73",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:27:30.822 crypto_ram1 00:27:30.822 crypto_ram2 00:27:30.822 crypto_ram3 ]] 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' 
"30a265c9-0e53-591f-b1a8-7cabaea47b73"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "30a265c9-0e53-591f-b1a8-7cabaea47b73",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "cf136c12-6602-59ca-a3fb-2ef4d545b620"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cf136c12-6602-59ca-a3fb-2ef4d545b620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5eb97947-a2f1-5113-85d3-b1b15f52115f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5eb97947-a2f1-5113-85d3-b1b15f52115f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c25dcc82-8009-59ed-aa50-b951ec840a73"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c25dcc82-8009-59ed-aa50-b951ec840a73",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:30.822 ************************************ 00:27:30.822 START TEST bdev_fio_trim 00:27:30.822 ************************************ 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1354 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local fio_dir=/usr/src/fio 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local sanitizers 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # shift 00:27:30.822 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # local asan_lib= 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libasan 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # for sanitizer in "${sanitizers[@]}" 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # grep libclang_rt.asan 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # awk '{print $3}' 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # asan_lib= 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # [[ -n '' ]] 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:30.823 19:04:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1350 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.823 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.823 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.823 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.823 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.823 fio-3.35 00:27:30.823 Starting 4 threads 00:27:43.016 00:27:43.016 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2251833: Wed Jul 24 19:04:26 2024 00:27:43.016 write: IOPS=44.2k, BW=173MiB/s (181MB/s)(1726MiB/10001msec); 0 zone resets 00:27:43.016 slat (usec): min=11, max=1076, avg=50.22, stdev=24.07 00:27:43.016 clat (usec): min=27, max=1317, avg=194.50, stdev=100.69 00:27:43.016 lat (usec): min=39, max=1394, avg=244.73, stdev=112.92 00:27:43.016 clat percentiles (usec): 00:27:43.016 | 50.000th=[ 176], 99.000th=[ 482], 99.900th=[ 594], 99.990th=[ 685], 00:27:43.016 | 99.999th=[ 1106] 00:27:43.016 bw ( KiB/s): min=157600, max=277088, per=100.00%, avg=177281.21, stdev=9491.66, samples=76 00:27:43.016 iops : min=39400, max=69272, avg=44320.26, stdev=2372.88, samples=76 00:27:43.016 trim: IOPS=44.2k, BW=173MiB/s (181MB/s)(1726MiB/10001msec); 0 zone resets 00:27:43.016 slat (usec): min=4, max=102, avg=15.32, stdev= 6.67 00:27:43.016 clat (usec): min=39, max=1394, avg=244.88, stdev=112.94 00:27:43.016 lat (usec): min=44, max=1432, avg=260.21, stdev=115.12 00:27:43.016 clat percentiles (usec): 00:27:43.016 | 50.000th=[ 229], 99.000th=[ 570], 99.900th=[ 693], 99.990th=[ 824], 00:27:43.016 | 99.999th=[ 1352] 00:27:43.016 bw ( KiB/s): min=157600, max=277088, per=100.00%, avg=177281.21, stdev=9491.66, samples=76 00:27:43.016 iops : min=39400, max=69272, avg=44320.26, stdev=2372.88, samples=76 00:27:43.016 lat (usec) : 50=1.19%, 100=10.83%, 250=52.48%, 500=33.84%, 750=1.65% 00:27:43.016 lat (usec) : 1000=0.01% 00:27:43.016 lat (msec) : 2=0.01% 00:27:43.016 cpu : usr=99.69%, sys=0.01%, ctx=75, majf=0, minf=115 00:27:43.016 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:43.016 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:43.016 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:43.016 issued rwts: total=0,441738,441739,0 short=0,0,0,0 dropped=0,0,0,0 00:27:43.016 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:43.016 00:27:43.016 Run status group 0 (all jobs): 00:27:43.016 WRITE: bw=173MiB/s (181MB/s), 173MiB/s-173MiB/s (181MB/s-181MB/s), io=1726MiB (1809MB), run=10001-10001msec 00:27:43.016 TRIM: bw=173MiB/s (181MB/s), 173MiB/s-173MiB/s (181MB/s-181MB/s), io=1726MiB (1809MB), run=10001-10001msec 00:27:43.016 00:27:43.016 real 0m13.217s 00:27:43.016 user 0m48.729s 00:27:43.016 sys 0m0.350s 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:43.016 ************************************ 00:27:43.016 END TEST 
bdev_fio_trim 00:27:43.016 ************************************ 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:27:43.016 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:27:43.016 00:27:43.016 real 0m26.778s 00:27:43.016 user 1m37.456s 00:27:43.016 sys 0m0.898s 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:43.016 ************************************ 00:27:43.016 END TEST bdev_fio 00:27:43.016 ************************************ 00:27:43.016 19:04:27 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:43.016 19:04:27 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:43.016 19:04:27 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:27:43.016 19:04:27 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:43.016 19:04:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:43.016 ************************************ 00:27:43.016 START TEST bdev_verify 00:27:43.016 ************************************ 00:27:43.016 19:04:27 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:43.016 [2024-07-24 19:04:27.261793] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:27:43.016 [2024-07-24 19:04:27.261834] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2253536 ] 00:27:43.016 [2024-07-24 19:04:27.325860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:43.016 [2024-07-24 19:04:27.399822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:43.016 [2024-07-24 19:04:27.399824] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:43.016 [2024-07-24 19:04:27.420823] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:43.016 [2024-07-24 19:04:27.428847] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:43.016 [2024-07-24 19:04:27.436872] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:43.016 [2024-07-24 19:04:27.537006] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:44.918 [2024-07-24 19:04:29.669817] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:44.918 [2024-07-24 19:04:29.669881] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:44.918 [2024-07-24 19:04:29.669889] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:44.918 [2024-07-24 19:04:29.677835] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:44.918 [2024-07-24 19:04:29.677846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:44.918 [2024-07-24 19:04:29.677852] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:44.918 [2024-07-24 19:04:29.685855] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:44.918 [2024-07-24 19:04:29.685865] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:44.918 [2024-07-24 19:04:29.685871] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:44.918 [2024-07-24 19:04:29.693876] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:44.918 [2024-07-24 19:04:29.693886] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:44.918 [2024-07-24 19:04:29.693897] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:44.918 Running I/O for 5 seconds... 
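For reference, the verify pass above is the bdevperf invocation traced at common/autotest_common.sh@1123; a minimal standalone sketch of that command is shown below, assuming the workspace layout recorded in this log, with flag annotations based on standard bdevperf usage rather than anything stated in the log itself.

  # Sketch of the logged bdev_verify run (same paths and flags as traced above):
  #   -q 128    queue depth: 128 outstanding I/Os per job
  #   -o 4096   I/O size in bytes
  #   -w verify write-then-read-back verification workload
  #   -t 5      run time in seconds
  #   -m 0x3    core mask 0x3 (two reactors, matching the two cores reported above)
  #   -C        carried over verbatim from the logged command
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/bdevperf --json test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3
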
00:27:50.183 00:27:50.183 Latency(us) 00:27:50.183 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:50.183 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x0 length 0x1000 00:27:50.183 crypto_ram : 5.05 735.35 2.87 0.00 0.00 173799.21 2980.33 113346.07 00:27:50.183 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x1000 length 0x1000 00:27:50.183 crypto_ram : 5.04 736.08 2.88 0.00 0.00 173621.75 3807.33 113346.07 00:27:50.183 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x0 length 0x1000 00:27:50.183 crypto_ram1 : 5.05 735.25 2.87 0.00 0.00 173452.58 3198.78 104857.60 00:27:50.183 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x1000 length 0x1000 00:27:50.183 crypto_ram1 : 5.04 735.98 2.87 0.00 0.00 173271.28 4088.20 104857.60 00:27:50.183 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x0 length 0x1000 00:27:50.183 crypto_ram2 : 5.03 5748.85 22.46 0.00 0.00 22121.51 5835.82 17476.27 00:27:50.183 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x1000 length 0x1000 00:27:50.183 crypto_ram2 : 5.03 5771.46 22.54 0.00 0.00 22041.55 4025.78 17601.10 00:27:50.183 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x0 length 0x1000 00:27:50.183 crypto_ram3 : 5.04 5762.63 22.51 0.00 0.00 22032.51 2652.65 17476.27 00:27:50.183 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:50.183 Verification LBA range: start 0x1000 length 0x1000 00:27:50.183 crypto_ram3 : 5.04 5770.27 22.54 0.00 0.00 22000.74 3588.88 17351.44 00:27:50.183 =================================================================================================================== 00:27:50.183 Total : 25995.88 101.55 0.00 0.00 39226.43 2652.65 113346.07 00:27:50.183 00:27:50.183 real 0m7.926s 00:27:50.183 user 0m15.247s 00:27:50.183 sys 0m0.264s 00:27:50.183 19:04:35 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:50.183 19:04:35 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:27:50.183 ************************************ 00:27:50.183 END TEST bdev_verify 00:27:50.183 ************************************ 00:27:50.183 19:04:35 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:50.183 19:04:35 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:27:50.183 19:04:35 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:50.183 19:04:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:50.442 ************************************ 00:27:50.442 START TEST bdev_verify_big_io 00:27:50.442 ************************************ 00:27:50.442 19:04:35 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:50.442 [2024-07-24 19:04:35.245477] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:27:50.442 [2024-07-24 19:04:35.245510] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2254820 ] 00:27:50.442 [2024-07-24 19:04:35.305587] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:50.442 [2024-07-24 19:04:35.378950] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:50.442 [2024-07-24 19:04:35.378963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:50.442 [2024-07-24 19:04:35.400299] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:50.442 [2024-07-24 19:04:35.408324] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:50.442 [2024-07-24 19:04:35.416358] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:50.700 [2024-07-24 19:04:35.510989] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:53.231 [2024-07-24 19:04:37.644033] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:53.231 [2024-07-24 19:04:37.644082] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:53.231 [2024-07-24 19:04:37.644090] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:53.231 [2024-07-24 19:04:37.652052] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:53.231 [2024-07-24 19:04:37.652064] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:53.231 [2024-07-24 19:04:37.652069] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:53.231 [2024-07-24 19:04:37.660073] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:53.231 [2024-07-24 19:04:37.660083] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:53.231 [2024-07-24 19:04:37.660088] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:53.231 [2024-07-24 19:04:37.668094] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:53.231 [2024-07-24 19:04:37.668103] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:53.231 [2024-07-24 19:04:37.668108] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:53.231 Running I/O for 5 seconds... 00:27:53.492 [2024-07-24 19:04:38.272562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.272866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.272915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.492 [2024-07-24 19:04:38.272946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.272971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.272996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.273297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.273306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.275906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.275937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.275962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.275991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.276701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.492 [2024-07-24 19:04:38.279820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.280113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.493 [2024-07-24 19:04:38.280122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.282866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.282896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.282921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.282966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.283664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.286632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.286671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.286704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.286729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.287418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.289950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.289988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.493 [2024-07-24 19:04:38.290065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.290840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.293774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.294093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.294102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.296984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.493 [2024-07-24 19:04:38.297012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.297333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.297342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.299613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.299642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.299668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.299694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.300406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.302934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.302963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.302988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.303723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.306129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.493 [2024-07-24 19:04:38.306157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.306182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.306207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.493 [2024-07-24 19:04:38.306557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.306585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.306609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.306636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.306961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.306970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.309971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.310288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.310297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.312586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.312625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.312654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.312680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.313056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.494 [2024-07-24 19:04:38.313084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.313109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.313134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.313450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.313460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.315751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.315781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.315806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.315832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.316535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.318845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.318873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.318898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.318926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.319265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.319292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.319334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.319359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.319604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.494 [2024-07-24 19:04:38.319613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.321984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.322797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.325721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.326005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.326014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.494 [2024-07-24 19:04:38.328456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.328846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.329171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.329180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.331979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.332295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.332304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.494 [2024-07-24 19:04:38.334983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.495 [2024-07-24 19:04:38.335009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.495 [2024-07-24 19:04:38.335035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:53.495 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously between 19:04:38.335035 and 19:04:38.618718; the intermediate identical entries are omitted ...]
00:27:53.763 [2024-07-24 19:04:38.618718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:53.763 [2024-07-24 19:04:38.619048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.619529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.621757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.622087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.622100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.623778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.623808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.623838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.623864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.763 [2024-07-24 19:04:38.624277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.624669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.626993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.627018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.627312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.627320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.763 [2024-07-24 19:04:38.629529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.629795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.631962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.632012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.632050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.763 [2024-07-24 19:04:38.632086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.632405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.632413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.634726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.635006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.764 [2024-07-24 19:04:38.635014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.636913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.636942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.636977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.637750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.639985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.640012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.640038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.640063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.640397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.640406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.764 [2024-07-24 19:04:38.642217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.642690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.643024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.643036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.644735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.644774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.644803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.644834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.645590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.764 [2024-07-24 19:04:38.647452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.647960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.648236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.648245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.650883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.652576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.652610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.652637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.652663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.652993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.764 [2024-07-24 19:04:38.653040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.653067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.653092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.653117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.653479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.764 [2024-07-24 19:04:38.653490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.655704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.656036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.656046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.657697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.657726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.657780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.657807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.765 [2024-07-24 19:04:38.658251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.658604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.660911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.661142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.661151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.662900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.662929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.662954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.662979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.765 [2024-07-24 19:04:38.663615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.663624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.665994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.667741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.765 [2024-07-24 19:04:38.668867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.668895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.668920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.668945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.765 [2024-07-24 19:04:38.669667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.669676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.671744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.672787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.672824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.766 [2024-07-24 19:04:38.672855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.672882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.673642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.675783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.676990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.766 [2024-07-24 19:04:38.677276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.677761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.679385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.679419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.680889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.681069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.681078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.682193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.682221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.682246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.682868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.766 [2024-07-24 19:04:38.683283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.683680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.686195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.687205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.687751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.688600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.688784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.689904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.690921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.691191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.691455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.691786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.691795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.694148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.694830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.695956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.696923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.697103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.698109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.698545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:53.766 [2024-07-24 19:04:38.698810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:53.766 [2024-07-24 19:04:38.699073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:54.035 [... identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" messages repeated several hundred times between 2024-07-24 19:04:38.699 and 19:04:38.948 ...]
00:27:54.035 [2024-07-24 19:04:38.948837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.948862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.948886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.949211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.949220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.950967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.950995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.951781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.952993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.035 [2024-07-24 19:04:38.953547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.953884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.955766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.955807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.955833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.035 [2024-07-24 19:04:38.955858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.956589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.957960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.957989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.958536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.036 [2024-07-24 19:04:38.958544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.959993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.960019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.960044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.960070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.960396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.960405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.962545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.036 [2024-07-24 19:04:38.963711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.963997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.964021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.964046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.964348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.964357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.966885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.967969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.967996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.036 [2024-07-24 19:04:38.968044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.968515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.970897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.971994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.972021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.036 [2024-07-24 19:04:38.972046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.037 [2024-07-24 19:04:38.972313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.972568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.974737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.975024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.975033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.037 [2024-07-24 19:04:38.976475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.976713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.977940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.977972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.977999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.978807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.979956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.980000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.980966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.980996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.037 [2024-07-24 19:04:38.981570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.981578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.982799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.982828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.982853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.983903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.985397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.986228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.987221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.988209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.988389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.988662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.988920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.989176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.989432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.989617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.989626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.037 [2024-07-24 19:04:38.991611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.992442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.993424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.994403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.994722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.995006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.995270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.995532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.995912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.996091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.996100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.998051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.037 [2024-07-24 19:04:38.999047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.000039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.000780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.001080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.001346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.001605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.001864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.002995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.003177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.003185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.005113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.006097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.038 [2024-07-24 19:04:39.007092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.007355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.007696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.007966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.008225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.008697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.009517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.009695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.009704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.011786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.012783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.013487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.013751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.014082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.014344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.014605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.015690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.016728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.016908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.016916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.019004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.020032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.020298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.020558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.038 [2024-07-24 19:04:39.020839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.021109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.021617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.022460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.023446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.023629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.023638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.025799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.026516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.026788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.027051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.027409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.027684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.028808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.029828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.030881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.031066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.031074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.033195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.033485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.033752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.034019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.034366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.038 [2024-07-24 19:04:39.034786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.038 [2024-07-24 19:04:39.035632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.036646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.037666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.037853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.037862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.039750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.040041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.040309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.040575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.040914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.042062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.043073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.044113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.045181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.045472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.045481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.046827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.047110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.047377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.047647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.047941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.048854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.049882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.050885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.301 [2024-07-24 19:04:39.051891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.052179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.052188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.053543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.053829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.054093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.054357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.054545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.055407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.056428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.057449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.057896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.058076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.058086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.059422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.059693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.059954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.060520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.060763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.061812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.062806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.063700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.064543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.301 [2024-07-24 19:04:39.064771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.301 [2024-07-24 19:04:39.064780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:54.301 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) is emitted continuously, several hundred times, from 19:04:39.064780 through 19:04:39.261142 ...]
00:27:54.307 [2024-07-24 19:04:39.261142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:54.307 [2024-07-24 19:04:39.261166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.261338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.261345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.262902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.263214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.263222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.264997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.265585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.307 [2024-07-24 19:04:39.265593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.266748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.266776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.266805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.266833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.307 [2024-07-24 19:04:39.267313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.269686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.270834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.308 [2024-07-24 19:04:39.270871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.271879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.271909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.272399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.274182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.274214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.274240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.275885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.278100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.279063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.279331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.308 [2024-07-24 19:04:39.279601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.279880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.280157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.280880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.281719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.282727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.282910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.282918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.285106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.285658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.285931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.286197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.286554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.286829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.287975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.289004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.290062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.290246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.290255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.292597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.292890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.293157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.293424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.293771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.308 [2024-07-24 19:04:39.294316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.295149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.296130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.297115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.297304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.297313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.299039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.299306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.299568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.299832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.300159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.301161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.302078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.303096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.304110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.304408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.304417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.305764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.308 [2024-07-24 19:04:39.306034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.570 [2024-07-24 19:04:39.306301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.570 [2024-07-24 19:04:39.306572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.570 [2024-07-24 19:04:39.306829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.570 [2024-07-24 19:04:39.307681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.570 [2024-07-24 19:04:39.308700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.571 [2024-07-24 19:04:39.309711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.310488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.310702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.310711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.312126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.312413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.312696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.312956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.313138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.313971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.314959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.315951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.316397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.316581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.316590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.318002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.318265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.318530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.319203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.319422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.320456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.321446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.322418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.323215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.571 [2024-07-24 19:04:39.323431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.323439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.324966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.325241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.325512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.326599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.326808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.327839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.328858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.329310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.330225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.330404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.330412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.332121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.332412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.333051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.333875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.334055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.335056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.335935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.336780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.337604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.337783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.337792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.571 [2024-07-24 19:04:39.339514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.339775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.340874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.341847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.342026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.343033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.343485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.344385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.345397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.345580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.345589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.347379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.348010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.348846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.349826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.350005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.350916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.351734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.352559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.353542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.353722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.353734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.355675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.356812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.571 [2024-07-24 19:04:39.357788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.358784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.358963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.359430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.360376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.361425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.362413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.362597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.362606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.364918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.365742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.571 [2024-07-24 19:04:39.366725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.367711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.367938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.368786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.369609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.370599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.371587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.371849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.371858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.375035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.376005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.377024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.378093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.572 [2024-07-24 19:04:39.378473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.379403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.380437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.381426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.382350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.382594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.382603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.385026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.386005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.386984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.387626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.387807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.388638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.389614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.390593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.391041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.391381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.391391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.393851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.394844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.395870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.396397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.396622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.397704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.572 [2024-07-24 19:04:39.398716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.399687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.399953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.400271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.400281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.402778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.403764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.404494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.405489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.405706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.406714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.407700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.408193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.408456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.408779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.408789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.411348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.412345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.412826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.413675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.413868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.414990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.416025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.416288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.572 [2024-07-24 19:04:39.416567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.416831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.416840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.419226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.419973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.420989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.421907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.422094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.423122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.423673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.423939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.424197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.424560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.424570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.426788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.427240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.428072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.429093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.429277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.430354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.430621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.430880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.431162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.431495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.572 [2024-07-24 19:04:39.431506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.433409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.433955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.434805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.572 [2024-07-24 19:04:39.435826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.436011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.437054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.437325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.437595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.437862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.438194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.438205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.439935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.440955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.441974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.442507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.442862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.443132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.443390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.443653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.444776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.444987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.444996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.573 [2024-07-24 19:04:39.447027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.573 [2024-07-24 19:04:39.448034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same "Failed to get src_mbufs!" error line from accel_dpdk_cryptodev.c:468 repeats for every allocation attempt between 19:04:39.448 and 19:04:39.706; duplicate entries omitted]
00:27:54.840 [2024-07-24 19:04:39.706454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:54.840 [2024-07-24 19:04:39.708331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.709405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.710470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.711597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.712229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.713066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.714058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.715049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.715228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.715237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.717595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.718418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.719332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.720027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.721025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.721954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.722560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.723665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.724010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.724021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.726466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.727298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.728335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.729352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.840 [2024-07-24 19:04:39.730781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.731822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.732937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.733972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.734248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.734257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.736785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.737793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.738789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.739573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.740705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.741721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.742715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.743530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.743789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.743798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.745756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.746021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.746279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.746540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.747117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.747383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.747645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.747902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.840 [2024-07-24 19:04:39.748262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.748271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.750169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.750428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.750687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.750950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.751493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.751752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.752018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.752276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.752533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.752542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.754475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.754743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.755027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.755292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.755876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.756144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.840 [2024-07-24 19:04:39.756418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.756697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.757029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.757038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.759025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.759286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.841 [2024-07-24 19:04:39.759548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.759807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.760374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.760645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.760906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.761161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.761488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.761497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.763344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.763623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.763882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.764145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.764717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.764974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.765228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.765492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.765754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.765763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.767920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.768185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.768447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.768708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.769320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.769589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.841 [2024-07-24 19:04:39.769853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.770113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.770430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.770441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.772388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.772421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.772679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.772941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.773485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.773754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.774010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.774271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.774596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.774606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.776490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.776523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.776802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.776831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.777384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.777418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.777696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.777724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.778066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.778075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.841 [2024-07-24 19:04:39.780155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.780189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.780451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.780482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.781771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.784304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.784347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.784631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.784660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.785961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.787989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.788023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.788303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.788330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.841 [2024-07-24 19:04:39.788901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.788930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.789194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.789226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.789493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.789501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.791522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.791557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.791839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.791870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.841 [2024-07-24 19:04:39.792490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.792520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.792807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.792837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.793184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.793193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.795274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.795307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.795590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.795629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.796221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.796254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.796522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.796550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.842 [2024-07-24 19:04:39.796850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.796860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.798854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.798887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.799142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.799169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.799785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.799818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.800080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.800113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.800480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.800493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.802995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.803249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.803552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.803562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.842 [2024-07-24 19:04:39.805333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.805837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.806216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.806226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.807957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.807988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.808817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.809908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.809936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.809961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.809986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.810304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.810343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.842 [2024-07-24 19:04:39.810370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.810396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.810582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.810592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.811757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.811786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.811811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.811839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.812591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.842 [2024-07-24 19:04:39.813946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.813975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.814599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.843 [2024-07-24 19:04:39.815643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.815679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.815706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.815731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.816452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.817938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.817967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.817992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.818497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.819680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.819730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.819757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.819785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.843 [2024-07-24 19:04:39.819995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.820024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.820051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.820079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.820373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.820383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.822695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.823863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.823893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.823919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.823943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.824150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.824178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.824202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.824227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:54.843 [2024-07-24 19:04:39.824441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.824451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.843 [2024-07-24 19:04:39.826777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.826802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.826980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.826988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:54.844 [2024-07-24 19:04:39.828662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.105 [2024-07-24 19:04:39.929029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.105 [2024-07-24 19:04:39.929609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.105 [2024-07-24 19:04:39.929655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.105 [2024-07-24 19:04:39.929890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.105 [2024-07-24 19:04:39.929926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.105 [2024-07-24 19:04:39.930161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:55.105 [2024-07-24 19:04:39.930527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:55.105 [2024-07-24 19:04:39.930537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: "Failed to get src_mbufs!" entry repeated for each subsequent task between 19:04:39.930 and 19:04:40.209 (console timestamps 00:27:55.105 through 00:27:55.372) ...]
00:27:55.372 [2024-07-24 19:04:40.209805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:55.372 [2024-07-24 19:04:40.209835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:55.372 [2024-07-24 19:04:40.209861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:55.372 [2024-07-24 19:04:40.210692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.210964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.210973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.213269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.214004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.215078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.215990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.216169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.217181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.217697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.217961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.218225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.218611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.218622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.222046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.223091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.224104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.224825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.225007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.225581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.225847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.226755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.227094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.227444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.372 [2024-07-24 19:04:40.227457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.229126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.230247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.231234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.232281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.232467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.232931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.233189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.233444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.233727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.234012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.234021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.237994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.239018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.239759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.240689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.241004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.241274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.242198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.242534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.242808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.242986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.242995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.245153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.372 [2024-07-24 19:04:40.246148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.247172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.248219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.248573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.248843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.249100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.249356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.250039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.250259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.250267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.372 [2024-07-24 19:04:40.253947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.254770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.255660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.255976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.256318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.257204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.257570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.257834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.258942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.259123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.259131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.261160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.261573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.261832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.373 [2024-07-24 19:04:40.261862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.262152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.262415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.262953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.263771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.264763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.264943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.264952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.268769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.268806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.269069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.269327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.269508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.269806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.270064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.270095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.271022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.271202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.271211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.272361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.273380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.274372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.274400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.274589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.373 [2024-07-24 19:04:40.274860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.274887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.275143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.275398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.275738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.275748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.279707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.280749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.280780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.281792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.282073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.282117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.283121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.283392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.283421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.283775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.283785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.285638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.285672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.285936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.286202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.286561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.286836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.287108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.373 [2024-07-24 19:04:40.287140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.288033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.288321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.288330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.290269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.290561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.290826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.290858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.291252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.291529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.291800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.291831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.292155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.292340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.292349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.294298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.294593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.294868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.373 [2024-07-24 19:04:40.294899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.295254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.295529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.295792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.295824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.296089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.374 [2024-07-24 19:04:40.296413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.296422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.299280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.299557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.299819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.299848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.300195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.300458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.300719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.300748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.301006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.301261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.301270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.303305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.303569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.303837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.303868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.304132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.304401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.304662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.304693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.304950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.305282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.305292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.374 [2024-07-24 19:04:40.307372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.307643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.307912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.307949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.308215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.308489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.308750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.308779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.309035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.309330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.309338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.311162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.312077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.312344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.312375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.312660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.312948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.313211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.313239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.313498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.313781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.313790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.316853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.317420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.374 [2024-07-24 19:04:40.317681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.317710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.317971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.318242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.318531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.318563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.318829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.319203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.319212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.321076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.321358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.322467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.322507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.322876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.323149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.323419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.323454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.323728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.324085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.324095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.326491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.326917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.327758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.328024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.374 [2024-07-24 19:04:40.328351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.328637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.328909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.328940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.329206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.329515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.329524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.331165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.331877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.331906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.332161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.332376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.333054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.333312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.333341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.374 [2024-07-24 19:04:40.333603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.333852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.333861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.336896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.336934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.337195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.337223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.337550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.338610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.375 [2024-07-24 19:04:40.338646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.338906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.339164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.339439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.339449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.341447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.341493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.341767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.341804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.342061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.342097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.342906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.342933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.343191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.343430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.343438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.346970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.347234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.347265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.375 [2024-07-24 19:04:40.348254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.348594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.348603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.350490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.350525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.350790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.350821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.351141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.351176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.351431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.351465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.351747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.352072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.352081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.354740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.355004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.355034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.355885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.356196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.375 [2024-07-24 19:04:40.356204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.358289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.358345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.358972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.359001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.359232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.359269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.359532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.359560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.360373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.360644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.360656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.363941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.363979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.364336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.364365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.364699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.364734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.364993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.365032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.375 [2024-07-24 19:04:40.365289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.376 [2024-07-24 19:04:40.365475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.376 [2024-07-24 19:04:40.365484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.376 [2024-07-24 19:04:40.367438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.376 [2024-07-24 19:04:40.367477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.644 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated for every allocation attempt between 19:04:40.367477 and 19:04:40.617930; intervening duplicate lines condensed ...] 
00:27:55.644 [2024-07-24 19:04:40.617930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.644 [2024-07-24 19:04:40.617938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.620157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.620195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.621108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.621366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.621675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.621950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.622282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.622312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.623087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.623423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.623433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.625818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.626596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.626856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.626884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.627114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.627870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.628131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.628160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.628418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.628670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.628681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.631981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.644 [2024-07-24 19:04:40.632253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.633007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.633039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.633328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.633602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.634411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.634440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.634745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.635071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.635085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.637716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.637986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.638245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.638276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.638559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.639410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.639836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.639866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.640127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.640314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.640323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.644 [2024-07-24 19:04:40.642622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.643669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.643943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.906 [2024-07-24 19:04:40.643976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.644311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.644591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.644863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.644900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.645938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.646286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.646295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.649369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.649643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.649902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.649933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.650113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.650384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.650673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.650713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.650992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.651342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.651351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.655211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.655516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.656466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.656499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.656832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.906 [2024-07-24 19:04:40.657111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.658091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.658120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.658383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.658700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.658709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.660827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.661096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.661361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.661401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.661796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.662849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.663108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.663138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.663553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.663735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.663744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.666044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.667060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.667319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.667353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.667657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.667931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.906 [2024-07-24 19:04:40.668239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.907 [2024-07-24 19:04:40.668270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.669068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.669404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.669413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.672613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.672880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.673264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.674088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.674419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.674692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.674961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.674999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.675409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.675599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.675607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.677542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.677821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.677852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.678677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.679023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.679290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.680250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.680282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.680540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.907 [2024-07-24 19:04:40.680849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.680858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.682916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.682954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.683213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.683255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.683553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.683825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.683856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.684756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.685014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.685300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.685309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.688940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.689200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.689228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.689491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.689794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.689802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.907 [2024-07-24 19:04:40.692544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.692588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.692849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.692880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.693110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.693151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.693840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.693869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.694125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.694335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.694343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.696937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.696977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.697685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.697717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.698002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.698039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.698299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.698325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.698989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.699214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.699223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.702059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.702096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.907 [2024-07-24 19:04:40.702353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.702391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.702694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.702738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.703001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.703042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.704102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.704482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.704492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.707441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.707486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.707746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.707775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.708067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.708112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.709097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.709126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.709381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.907 [2024-07-24 19:04:40.709697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.709706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.711758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.711795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.712055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.712085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.908 [2024-07-24 19:04:40.712366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.712410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.713099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.713128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.713553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.713888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.713898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.718821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.719901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.719938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.720922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.721103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.721112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.724608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.724646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.725497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.725525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.725705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.725745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.908 [2024-07-24 19:04:40.726734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.726767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.727203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.727387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.727395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.730333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.730370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.731132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.731390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.731653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.731691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.732509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.732538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.733520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.733705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.733714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.736585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.737698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.737960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.737989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.738297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.738335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.739326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.739589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.908 [2024-07-24 19:04:40.739617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.739896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.739904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.742862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.742895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.742919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.742943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.743420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.745611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.745644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.745670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.745696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.745970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.746004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.746029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.746053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.746078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.746315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.908 [2024-07-24 19:04:40.746323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.749982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.753569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.753601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.753630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.908 [2024-07-24 19:04:40.753655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.753872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.753910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.753935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.753960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.753985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.754164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.754172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.909 [2024-07-24 19:04:40.757141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.757853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.759996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.760175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.760183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.762865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.762897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:55.909 [2024-07-24 19:04:40.762926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:55.909 [2024-07-24 19:04:40.763403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats for every allocation attempt between 19:04:40.763 and 19:04:41.009 ...]
00:27:56.177 [2024-07-24 19:04:41.009694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.177 [2024-07-24 19:04:41.010688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.010718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.011710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.011963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.012239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.012499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.012528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.012784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.013090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.013098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.014708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.014742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.015569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.015597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.015778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.016793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.016823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.017398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.017668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.017999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.018009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.177 [2024-07-24 19:04:41.020464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.020499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.021495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.178 [2024-07-24 19:04:41.021523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.021791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.021831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.022693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.022723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.023705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.023886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.023895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.025747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.025779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.026664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.026693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.026914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.026952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.027940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.027969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.028955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.029304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.029313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.030659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.030690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.030946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.030973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.031244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.178 [2024-07-24 19:04:41.031277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.031536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.031563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.032420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.032681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.032691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.034643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.034677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.035680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.035710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.035897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.035939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.036342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.036371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.036638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.036930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.036939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.039436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.039483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.040510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.040539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.040755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.040791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.041630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.178 [2024-07-24 19:04:41.041659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.042671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.042856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.042865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.044759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.044789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.045865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.045901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.046090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.046125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.047133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.047163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.048222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.048539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.048548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.049861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.049892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.050822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.051734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.178 [2024-07-24 19:04:41.051982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.051991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.053940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.053973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.054950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.054979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.055161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.055201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.055618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.055646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.055901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.056200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.056209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.058709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.058750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.059753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.178 [2024-07-24 19:04:41.060524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.060740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.060779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.061770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.061800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.062784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.063001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.063011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.179 [2024-07-24 19:04:41.064816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.065801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.066657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.066685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.066864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.066903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.067888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.068340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.068368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.068568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.068577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.069731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.069759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.069784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.069811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.070604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.071885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.071912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.179 [2024-07-24 19:04:41.071936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.071960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.072514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.073997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.074023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.074364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.074373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.075898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.075927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.075968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.179 [2024-07-24 19:04:41.076191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.076492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.077991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.078016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.078051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.078390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.078399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.079994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.179 [2024-07-24 19:04:41.080324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.179 [2024-07-24 19:04:41.080348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.080372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.080548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.080559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.081701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.081728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.081756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.082739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.083490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.085046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.086996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.180 [2024-07-24 19:04:41.087031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.087213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.087221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.088594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.088624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.088650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.088906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.089245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.089279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.089801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.089833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.089858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.090080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.090088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.091196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.091224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.092047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.092076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.092257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.093285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.093316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.093342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.093662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.094036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.180 [2024-07-24 19:04:41.094046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.095632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.096967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.097794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.097823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.098003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.098011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.099341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.099373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.099401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.099661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.100771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.101889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.180 [2024-07-24 19:04:41.101918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.101942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.102761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.102942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.102982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.103007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.103993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.104022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.104271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.180 [2024-07-24 19:04:41.104280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.106457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.106486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.106511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.107313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.107497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.107539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.107567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.108547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.108576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.108754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.108763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.109839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.109866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.109894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.181 [2024-07-24 19:04:41.110392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.110747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.110781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.110806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.111062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.111089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.111446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.111455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.112629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.112677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.112704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.113587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.113825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.113860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.113885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.114708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.114738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.114919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.114927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.116410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.116449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.116482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.116739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.181 [2024-07-24 19:04:41.117041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.181 [2024-07-24 19:04:41.117078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.181 [... the same accel_dpdk_cryptodev.c:468 "Failed to get src_mbufs!" error repeats for every allocation attempt logged between 19:04:41.117 and 19:04:41.332 ...]
00:27:56.449 [2024-07-24 19:04:41.332691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.449 [2024-07-24 19:04:41.332948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.333290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.333324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.334232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.334262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.335134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.335318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.335328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.336456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.337431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.449 [2024-07-24 19:04:41.338454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.338489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.338853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.338897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.339164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.339431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.339463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.339827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.339837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.340991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.450 [2024-07-24 19:04:41.341300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.341568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.342694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.342737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.342763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.342789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.343548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.344916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.344946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.344971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.344995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.450 [2024-07-24 19:04:41.345258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.345589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.346631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.346660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.346685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.346710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.347544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.450 [2024-07-24 19:04:41.349641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.349650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.350809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.350837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.350863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.350888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.351477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.450 [2024-07-24 19:04:41.353217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.353244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.353271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.354594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.451 [2024-07-24 19:04:41.355769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.356763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.356794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.356820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.357751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.360323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.360367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.360394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.361284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.361520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.361558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.362373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.362402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.362426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.362633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.362643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.364109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.364138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.451 [2024-07-24 19:04:41.364402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.364429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.364757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.365796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.365827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.365855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.366866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.367055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.367063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.368230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.369945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.370299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.370310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.372770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.372821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.372847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.373852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.451 [2024-07-24 19:04:41.374145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.374186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.374212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.375059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.375089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.375276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.375285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.376772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.376802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.376845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.377111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.377450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.377493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.377520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.378577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.378613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.378793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.378801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.379918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.379947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.380006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.381034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.381223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.381262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.451 [2024-07-24 19:04:41.381297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.382290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.382320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.382618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.382627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.384410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.384444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.384490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.385311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.385508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.451 [2024-07-24 19:04:41.385549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.385575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.386582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.386612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.386971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.386980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.388758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.389021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.452 [2024-07-24 19:04:41.389047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.389383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.389392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.390489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.390518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.390547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.391027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.391215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.391251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.391284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.392383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.392422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.392617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.392630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.394878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.395697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.395727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.395907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.452 [2024-07-24 19:04:41.395916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.396990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.397017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.397042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.398677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.399024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.399034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.400891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.401904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.401937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.402234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.402244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.403837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.452 [2024-07-24 19:04:41.403885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.404849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.405135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.405144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.406206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.406834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.406863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.407685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.407867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.407907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.408897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.408927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.408952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.409182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.409191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.411046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.412149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.412176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.452 [2024-07-24 19:04:41.413253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.413436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.414457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.414492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.415063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.415092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.415311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.452 [2024-07-24 19:04:41.415320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.416572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.416837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.416866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.417126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.417482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.418460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.418497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.419423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.419452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.419640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.419650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.420785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.421093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.421121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.421378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.421681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.453 [2024-07-24 19:04:41.421953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.421981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.422703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.422743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.422955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.422964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.424098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.424993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.425024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.426106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.426293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.426717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.426760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.427017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.427044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.427321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.427330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.428779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.429831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.429862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.430744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.430982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.431817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.453 [2024-07-24 19:04:41.431848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.453 [2024-07-24 19:04:41.432835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.721 [2024-07-24 19:04:41.631417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.721 (the *ERROR* line above repeated continuously between 19:04:41.432 and 19:04:41.631; intermediate duplicate entries collapsed)
00:27:56.721 [2024-07-24 19:04:41.631462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.631753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.631784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.631811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.632159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.632169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.633675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.633704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.634688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.634719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.634897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.721 [2024-07-24 19:04:41.635367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.635396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.635421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.636268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.636453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.636462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.637877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.638585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.722 [2024-07-24 19:04:41.639460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.639534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.639718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.639726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.642038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.642077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.642102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.643979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.645447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.645480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.645506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.646513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.646692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.646731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.646756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.647206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.647234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.722 [2024-07-24 19:04:41.647415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.647424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.648651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.648680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.648706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.648970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.649812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.650937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.650972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.650998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.651894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.652073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.652108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.652141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.653196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.653227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.653412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.653421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.722 [2024-07-24 19:04:41.655181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.655210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.655237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.656234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.656484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.656524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.656549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.657522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.657551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.657729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.657738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.658848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.658876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.658927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.659942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.660874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.662264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.662292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.722 [2024-07-24 19:04:41.662316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.663312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.663615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.663668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.663693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.664556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.722 [2024-07-24 19:04:41.664584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.664764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.664772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.666842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.667664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.667693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.667906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.667914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.723 [2024-07-24 19:04:41.669329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.669395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.670394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.670423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.670603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.670613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.672886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.672919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.673906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.673934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.674844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.676317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.676587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.676618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.676875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.677114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.677149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.723 [2024-07-24 19:04:41.677964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.677993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.678018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.678197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.678206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.679364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.680356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.680386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.681487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.681765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.682706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.683863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.684686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.684716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.685731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.685967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.686975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.687007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.687988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.723 [2024-07-24 19:04:41.688017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.688378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.688387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.690322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.691183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.691213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.692196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.692376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.693461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.693493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.694268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.694296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.694530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.694539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.695844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.696105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.696151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.696410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.696747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.697847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.697876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.698954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.698984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.699162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.723 [2024-07-24 19:04:41.699171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.723 [2024-07-24 19:04:41.700383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.701371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.701400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.701936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.702239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.702513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.702543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.702805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.702836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.703166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.703175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.704247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.704548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.704580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.705484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.705666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.706679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.706715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.707762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.707799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.708166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.708178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.710017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.724 [2024-07-24 19:04:41.710281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.710312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.710574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.710924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.711797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.713550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.713814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.713847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.714976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.715315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.715327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.717059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.717329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.717361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.724 [2024-07-24 19:04:41.717387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.717720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.717992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.718024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.718291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.718326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.718635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.718644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.720718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.720755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.720793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.724 [2024-07-24 19:04:41.721055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.721394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.721689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.721723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.721748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.722014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.722324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.722333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.724422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.724702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.724967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.725245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.725545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.986 [2024-07-24 19:04:41.725820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.726088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.726358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.726635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.726999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.727011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.728943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.729206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.729465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.729727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.730033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.730311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.730582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.730843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.731100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.731425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.731435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.733351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.733618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.733879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.734141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.986 [2024-07-24 19:04:41.734405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.734701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.734969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.987 [2024-07-24 19:04:41.735235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.735507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.735784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.735792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.737814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.738083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.738345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.738607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.738913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.739182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.739441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.739732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.740678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.740956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.740965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.742853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.743120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.744057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.744324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.744651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.745473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.745867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.746124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.987 [2024-07-24 19:04:41.746382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.987 [2024-07-24 19:04:41.746649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously for every task in this burst, 19:04:41.746 through 19:04:41.958 ...]
00:27:56.993 [2024-07-24 19:04:41.958730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:56.993 [2024-07-24 19:04:41.959176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.959203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.959387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.959396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.962680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.963759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.963795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.963973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.963982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.968786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.969040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.969067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.993 [2024-07-24 19:04:41.969398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.969407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.972639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.973689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.973720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.974721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.974913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.974954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.975629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.975658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.975684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.975975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.975984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.977718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.978760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.978792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.979782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.979964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.980964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.980994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.981431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.981459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.981673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.981682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:56.993 [2024-07-24 19:04:41.982918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.983179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.983206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.983460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.983801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.984700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.993 [2024-07-24 19:04:41.984730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.985606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.985635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.985814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.985822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.987032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.988124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.988162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.989200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.989502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.989774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.989806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.990068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.990094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.990436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:56.994 [2024-07-24 19:04:41.990446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.992125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.992397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.257 [2024-07-24 19:04:41.992426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.992699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.993974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.995791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.996074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.996102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.996364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.996703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.996971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.997002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.997263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.257 [2024-07-24 19:04:41.997293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.997595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.997605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.999360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.999629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.999661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:41.999931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.258 [2024-07-24 19:04:42.000254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.000522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.000553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.000811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.000841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.001212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.001220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.003008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.003271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.003309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.003573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.003896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.004841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.006556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.006826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.006855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.007119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.007404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.007681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.258 [2024-07-24 19:04:42.007714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.007976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.008003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.008330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.008339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.010281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.010568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.010596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.010623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.010918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.011760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.013904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.013937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.013975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.014232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.014604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.014869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.014910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.014935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.258 [2024-07-24 19:04:42.015189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.015492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.015520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.017578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.017840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.018104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.018366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.018697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.018962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.019218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.019480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.019742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.020033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.020042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.022053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.022316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.022581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.022838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.023177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.023442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.023731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.023998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.024260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.024604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.258 [2024-07-24 19:04:42.024614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.026647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.026919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.027179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.027444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.027735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.028004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.028260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.028520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.028781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.029122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.258 [2024-07-24 19:04:42.029131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.031259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.031530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.031790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.032046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.032324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.032592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.032852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.033113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.033370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.033710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.033720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.035620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.259 [2024-07-24 19:04:42.035903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.036172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.036438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.036720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.036999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.037266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.037533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.037799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.038145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.038154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.040326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.040599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.041722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.041997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.042182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.042487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.042753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.043014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.043277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.043597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.043607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.045625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.045899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.046169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.259 [2024-07-24 19:04:42.046201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.046498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.046770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.047034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.047297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.047569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.047813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.047823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.049905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.049956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.050222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.050491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.050832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.051992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.053965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.054253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.054531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.054565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.054893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.259 [2024-07-24 19:04:42.055163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.055191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.055453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.056124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.056345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.056353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.058300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.059326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.059358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.060379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.060628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.060671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.060942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.061210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.061241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.061613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.061623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.063757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.063790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.064310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.065144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.065327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.066353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.067312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.259 [2024-07-24 19:04:42.067342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.067605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.067939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.067950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.259 [2024-07-24 19:04:42.069448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.070440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.071426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.071456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.071772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.072833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.073942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.073974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.074952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.075136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.075144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.077508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.078336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.079331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.079360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.079547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.080352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.081307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.081337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.082270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.260 [2024-07-24 19:04:42.082452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.082461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.084359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.084699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.085554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.085583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.085766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.086897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.087600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.087628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.088166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.088351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.088360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.089712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.089972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.090229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.090258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.090557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.091500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.092518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.092548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.093536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.093718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.260 [2024-07-24 19:04:42.093727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.260 [2024-07-24 19:04:42.095726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously between test timestamps 19:04:42.095 and 19:04:42.280 (console time 00:27:57.260-00:27:57.529); duplicate lines condensed ...] 
00:27:57.529 [2024-07-24 19:04:42.280542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.529 [2024-07-24 19:04:42.280896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.280908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.282672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.282931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.282958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.283212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.283495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.283764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.283795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.284051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.284078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.284409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.529 [2024-07-24 19:04:42.284419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.286098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.286383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.286412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.286439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.286791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.287731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.530 [2024-07-24 19:04:42.289786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.289829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.289869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.290987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.291322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.291332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.293269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.293540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.293807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.294075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.294370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.294646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.294911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.295173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.295450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.295745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.295755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.297917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.298193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.530 [2024-07-24 19:04:42.298478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.298744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.299105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.299382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.299657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.299929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.300195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.300504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.300513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.302454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.302739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.303003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.303270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.303453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.303768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.304825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.305089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.305345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.305639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.305648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.307560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.307820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.308077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.308336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:57.530 [2024-07-24 19:04:42.308593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.308861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.309120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.309376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.309636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.309896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.309905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.311822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:57.530 [2024-07-24 19:04:42.338767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.339584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.341357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.341630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.341694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.341946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.342342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.342529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.342581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.343601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.343650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.344747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.344798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.345558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.346388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.346574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.530 [2024-07-24 19:04:42.346587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.346597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.350202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.350701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.351793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.352808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.352991] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.353491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.530 [2024-07-24 19:04:42.353757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.354017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.354279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.354567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.354581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.354591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.356514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.357433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.358296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.358567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.358908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.359176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.359441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.360476] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.361408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.361605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.531 [2024-07-24 19:04:42.361618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.361628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.364552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.364825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.365356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.366172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.366354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.367412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.368020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.368836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.369749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.369934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.369948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.369960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.372408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.373215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.374174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.374729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.374913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.375726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.376632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.377155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.377419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.377753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.531 [2024-07-24 19:04:42.377767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.377777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.381306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.382308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.383316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.383586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.383922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.384192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.384452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.385310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.386125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.386309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.386321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.386331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.388144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.388419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.388685] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.388949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.389280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.390349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.391425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.392555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.393105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.393337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.531 [2024-07-24 19:04:42.393350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.393360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.396660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.397486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.398471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.399451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.399675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.400617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.401444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.402359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.402770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.403131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.403147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.403160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.405637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.406080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.406903] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.407820] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.408003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.408276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.408544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.408809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.409072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.409280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.531 [2024-07-24 19:04:42.409293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.409303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.412280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.531 [2024-07-24 19:04:42.412556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.412822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.413234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.413424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.414386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.414659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.415645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.416650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.416979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.416993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.417005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.419396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.420261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.420832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.421738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.421932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.422211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.422478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.422741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.423003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.423183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.532 [2024-07-24 19:04:42.423196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.423206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.426825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.427104] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.427700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.428451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.428638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.429005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.429928] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.430771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.431040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.431385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.431402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.431416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.433552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.434223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.435182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.435451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.435794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.436061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.436325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.437299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.438089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.438313] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.532 [2024-07-24 19:04:42.438327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.438346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.441256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.442137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.443214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.443595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.443778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.444902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.445172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.445433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.445700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.446045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.446062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.446076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.448164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.448831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.449099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.449368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.449736] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.450005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.451050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.452077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.453170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.453575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.532 [2024-07-24 19:04:42.453589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.453600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.456717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.457757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.458525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.459260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.459449] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.460203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.460481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.460749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.461028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.461376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.461390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.461402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.463529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.463799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.464073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.464342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.464681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.464950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.465213] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.465483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.465752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.532 [2024-07-24 19:04:42.466073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.532 [2024-07-24 19:04:42.466087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.466098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.468595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.468867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.469132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.469399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.469746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.470027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.470290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.470556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.470821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.471131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.471144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.471154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.473194] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.474159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.474430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.474695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.475032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.475300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.475568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.476532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.477249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.533 [2024-07-24 19:04:42.477494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.533 [2024-07-24 19:04:42.477514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:27:57.533 [the same *ERROR* line from accel_dpdk_cryptodev.c:476 repeats for every allocation attempt between 19:04:42.477514 and 19:04:42.627993]
00:27:57.853 [2024-07-24 19:04:42.627993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:27:57.853 [2024-07-24 19:04:42.628027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628115] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.628507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.629997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630074] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.630296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.632716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.632989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.853 [2024-07-24 19:04:42.633023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.634116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.634339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.634352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.634405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.635943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.637408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.637705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.637737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.638009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.638203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.638217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.638255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.639040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.639076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.640087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.640384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.640409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.853 [2024-07-24 19:04:42.640421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.640433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.643274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.643923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.643958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.853 [2024-07-24 19:04:42.644396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.644734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.644750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.644788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.645743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.645776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.646687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.646884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.646896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.646906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.646916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.649652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.649931] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.649964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.650235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.650483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.650496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.650536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.651272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.854 [2024-07-24 19:04:42.651307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.652280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.652529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.652544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.652557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.652569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.655192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.655606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.655641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.656318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.656666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.656683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.656723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.657406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.657440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.658222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.658416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.658432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.658443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.658454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.661366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.661660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.661694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.661966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.854 [2024-07-24 19:04:42.662249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.662262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.662302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.663061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.663099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.664069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.664273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.664287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.664305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.664318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.666889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.667161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.667195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.668109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.668508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.668525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.668567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.669052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.669085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.669844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.670034] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.670048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.670060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.670071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.854 [2024-07-24 19:04:42.673206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.673486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.673753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.674119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.674305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.674320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.674364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.675532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.676620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.677041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.677225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.677238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.677249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.677259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.680690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.680965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.681960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.682857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.683048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.683061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.683690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.684625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.685622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.686667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.854 [2024-07-24 19:04:42.686950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.686963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.854 [2024-07-24 19:04:42.686982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.686998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.689448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.690490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.691157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.692173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.692514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.692528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.692810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.693836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.694671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.698334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.698629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.698902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.699844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.700067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.700080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.855 [2024-07-24 19:04:42.700829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.701591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.702510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.702992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.703183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.703196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.703208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.703219] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.706034] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.706314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.707319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.708303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.708608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.708625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.708893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.709159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.709426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.710424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.710696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.710709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.710720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.710731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.713121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.714154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.855 [2024-07-24 19:04:42.714420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.714926] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.715143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.715156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.716324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.716679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.717631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.718505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.718819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.718832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.718844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.718856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.722359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.723315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.723756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.724783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.725150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.725165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.725433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.726509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.726784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.727329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.727560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.727574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.855 [2024-07-24 19:04:42.727584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.727596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.731175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.731475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.732386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.733131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.733354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.733367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.734107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.735063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.735546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.736600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.736973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.736989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.737003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.737019] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.739327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.740374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.741386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.741658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.741994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.742010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.855 [2024-07-24 19:04:42.742276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.742548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.856 [2024-07-24 19:04:42.743467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.744335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.744534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.744548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.744559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.744570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.746839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.747873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.748147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.748714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.748975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.748988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.749503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.750222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.750501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.750767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.751070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.751083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.751095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.751107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.754735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.755088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.755350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.755624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.856 [2024-07-24 19:04:42.755873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.755887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.756169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.756436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.756725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.756999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.757368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.757382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.757394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.757408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.760409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.760691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.760979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.761251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.761559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.761574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.761848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.762575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.763590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.764027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.764212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.764225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.764235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.764246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.856 [2024-07-24 19:04:42.767675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.767983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.768248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.768518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.768826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.768840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.769106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.769962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.770725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.771275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.771525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.771538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.771549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.771560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.773603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.774715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.775006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.776080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.776373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.776386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.776680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.776942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.777206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:57.856 [2024-07-24 19:04:42.778052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:57.856 [2024-07-24 19:04:42.778286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... identical accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources "Failed to get dst_mbufs!" errors repeat continuously from 19:04:42.778286 to 19:04:42.934390; duplicate log entries omitted ...]
00:27:58.126 [2024-07-24 19:04:42.934390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:27:58.126 [2024-07-24 19:04:42.934586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.934599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.934610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.934621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.935776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.935819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.935852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.935885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936572] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.936593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.126 [2024-07-24 19:04:42.937932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.937991] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.938020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.938204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.938217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.938228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.938239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.939990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.940336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.941273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.942089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.126 [2024-07-24 19:04:42.942124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.942153] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.942335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.126 [2024-07-24 19:04:42.942347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.942389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.942839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.942883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.942914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.943260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.943275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.943291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.943304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.944988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.945216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.945229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.127 [2024-07-24 19:04:42.945241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.945253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.946973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.127 [2024-07-24 19:04:42.948406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.948780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.949623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.949905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.949939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.950201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.950554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.950571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.950616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.951264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.951297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.952101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.952286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.952299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.952310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.952320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.953283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.954215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.954252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.954609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.127 [2024-07-24 19:04:42.954969] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.954985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.955026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.955289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.955327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.955735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.127 [2024-07-24 19:04:42.955927] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.955940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.955952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.955968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.956857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.957713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.957760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.958659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.958854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.958869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.958914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959219] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.959854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.128 [2024-07-24 19:04:42.960597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.961622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.961659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.962109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.962296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.962309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.962354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.963460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.963518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.964592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.964850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.964872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.964890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.964901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.966123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.967137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.967185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.968305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.968650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.968664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.968709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.969562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.969596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.970528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.128 [2024-07-24 19:04:42.970722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.970747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.970759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.970779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.971848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.972458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.972498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.973226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.973418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.973432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.973483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.974440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.974478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.975321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.975509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.975523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.975533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.975545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.976661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.976932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.976967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.978135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.978334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.978347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.128 [2024-07-24 19:04:42.978391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.978830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.978870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.979957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.980142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.980155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.980166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.980176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.981211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.981485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.982564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.983585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.983935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.983948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.983993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.985085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.986170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.986448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.986816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.986834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.128 [2024-07-24 19:04:42.986846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.986857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.988604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.989046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.129 [2024-07-24 19:04:42.990128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.991183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.991479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.991494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.991773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.992062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.992710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.993473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.993662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.993674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.993691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.993703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.994854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.995133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.995411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.996159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.996418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.996431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.997373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.997933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.998874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.999565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.999879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.999894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.129 [2024-07-24 19:04:42.999906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:42.999918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.001559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.002389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.003000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.003937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.004155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.004170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.004450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.004737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.005016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.005950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.006138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.006151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.006161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.006172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.007783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.008081] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.008357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.008841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.009036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.009050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.010114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.010403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.129 [2024-07-24 19:04:43.011327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.012300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.012552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.012567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.012579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.012590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.014367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.015389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.016406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.016949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.017135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.017148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.018097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.018361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.018632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.018901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.019088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.019100] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.019114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.019125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.020955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.021645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.021913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.022179] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.129 [2024-07-24 19:04:43.022495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.022510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.022784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.023924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.025144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.025419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.129 [2024-07-24 19:04:43.025696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.025964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.026242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.026256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.026534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.026805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.027775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.130 [2024-07-24 19:04:43.029549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.029833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.030099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.030392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.030580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.030594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.031560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.032617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.033135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.033944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.034127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.034140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.034151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.034161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.035185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.035570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.036427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.036706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.036972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.036986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.037279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.037618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.038534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.039648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:27:58.130 [2024-07-24 19:04:43.040033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.040047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.040059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.040070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.041129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.041409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.042561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.043165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.043391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.043404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.044541] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.044815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.045591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.047589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:27:58.130 [2024-07-24 19:04:43.049530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:58.130 [2024-07-24 19:04:43.049616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
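The burst of allocation errors above is the dpdk_cryptodev accel module failing to pull mbufs out of its pool fast enough while the 128-deep, 64 KiB verify jobs are in flight; the verify workload nevertheless completes, as the summary table that follows shows (Fail/s stays at 0.00). As a rough, hedged sketch of what such a bulk-allocation failure path looks like (illustrative C built on the public DPDK/SPDK APIs, not the actual accel_dpdk_cryptodev.c source; the pool, function name, and back-off comment are assumptions):

```c
#include <errno.h>

#include <rte_mbuf.h>
#include <rte_mempool.h>

#include "spdk/log.h"

/* Illustrative sketch: grab `count` destination mbufs from a shared pool in
 * one shot.  rte_pktmbuf_alloc_bulk() is all-or-nothing, so a temporarily
 * exhausted pool yields exactly the kind of "Failed to get dst_mbufs!"
 * entries seen above; the caller is expected to back off and retry. */
static int
alloc_dst_mbufs(struct rte_mempool *mbuf_pool, struct rte_mbuf **dst_mbufs,
		unsigned int count)
{
	if (rte_pktmbuf_alloc_bulk(mbuf_pool, dst_mbufs, count) != 0) {
		SPDK_ERRLOG("Failed to get dst_mbufs!\n");
		return -ENOMEM;
	}
	return 0;
}
```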
00:27:58.389 
00:27:58.389 Latency(us)
00:27:58.389 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:27:58.389 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x0 length 0x100
00:27:58.389 crypto_ram : 5.51 58.26 3.64 0.00 0.00 2093320.37 35951.18 1725656.50
00:27:58.389 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x100 length 0x100
00:27:58.389 crypto_ram : 5.46 54.02 3.38 0.00 0.00 2259891.90 15915.89 1797558.86
00:27:58.389 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x0 length 0x100
00:27:58.389 crypto_ram1 : 5.52 62.08 3.88 0.00 0.00 1939609.75 28336.52 1589840.94
00:27:58.389 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x100 length 0x100
00:27:58.389 crypto_ram1 : 5.48 57.99 3.62 0.00 0.00 2071663.62 15791.06 1653754.15
00:27:58.389 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x0 length 0x100
00:27:58.389 crypto_ram2 : 5.38 415.24 25.95 0.00 0.00 285951.28 15354.15 437405.99
00:27:58.389 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x100 length 0x100
00:27:58.389 crypto_ram2 : 5.39 404.69 25.29 0.00 0.00 293469.61 27462.70 449389.71
00:27:58.389 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x0 length 0x100
00:27:58.389 crypto_ram3 : 5.45 429.28 26.83 0.00 0.00 271153.28 8051.57 329552.46
00:27:58.389 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:58.389 Verification LBA range: start 0x100 length 0x100
00:27:58.389 crypto_ram3 : 5.45 421.91 26.37 0.00 0.00 276131.31 11671.65 307582.29
00:27:58.389 ===================================================================================================================
00:27:58.389 Total : 1903.47 118.97 0.00 0.00 504503.02 8051.57 1797558.86
00:27:58.647 
00:27:58.647 real 0m8.395s
00:27:58.647 user 0m16.147s
00:27:58.647 sys 0m0.315s
00:27:58.647 19:04:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:27:58.647 19:04:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:27:58.647 ************************************
00:27:58.647 END TEST bdev_verify_big_io
00:27:58.647 ************************************
00:27:58.647 19:04:43 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:58.647 19:04:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:27:58.647 19:04:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:58.647 19:04:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:27:58.647 ************************************
00:27:58.647 START TEST bdev_write_zeroes
00:27:58.647 ************************************
00:27:58.647 19:04:43 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
--json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:58.905 [2024-07-24 19:04:43.702764] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:27:58.905 [2024-07-24 19:04:43.702797] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256187 ] 00:27:58.905 [2024-07-24 19:04:43.765643] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:58.905 [2024-07-24 19:04:43.837474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:58.905 [2024-07-24 19:04:43.858386] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:58.905 [2024-07-24 19:04:43.866411] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:58.905 [2024-07-24 19:04:43.874430] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:59.163 [2024-07-24 19:04:43.972633] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:01.692 [2024-07-24 19:04:46.107491] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:01.692 [2024-07-24 19:04:46.107541] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:01.692 [2024-07-24 19:04:46.107549] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.692 [2024-07-24 19:04:46.115502] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:01.692 [2024-07-24 19:04:46.115513] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:01.692 [2024-07-24 19:04:46.115518] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.692 [2024-07-24 19:04:46.123709] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:01.692 [2024-07-24 19:04:46.123720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:01.693 [2024-07-24 19:04:46.123725] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.693 [2024-07-24 19:04:46.131740] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:01.693 [2024-07-24 19:04:46.131751] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:01.693 [2024-07-24 19:04:46.131756] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.693 Running I/O for 1 seconds... 
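This pass reuses the bdevperf application with a write_zeroes workload against the four crypto_qat bdevs defined in bdev.json; the "Found key" and "vbdev creation deferred pending base bdev arrival" notices above are the crypto vbdevs being layered on Malloc0 through Malloc3 as those base bdevs appear. A rough standalone equivalent of the invocation traced above (SPDK_DIR is shorthand for this job's workspace path; bdev.json is the file generated earlier in the test):

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK_DIR"/build/examples/bdevperf \
      --json "$SPDK_DIR"/test/bdev/bdev.json \
      -q 128 -o 4096 -w write_zeroes -t 1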
00:28:02.258 
00:28:02.258 Latency(us)
00:28:02.258 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:02.258 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:02.258 crypto_ram : 1.02 3085.18 12.05 0.00 0.00 41290.83 3588.88 48683.89
00:28:02.258 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:02.258 crypto_ram1 : 1.02 3090.89 12.07 0.00 0.00 41064.89 3542.06 45438.29
00:28:02.258 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:02.258 crypto_ram2 : 1.01 24071.67 94.03 0.00 0.00 5264.90 1560.38 6834.47
00:28:02.258 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:02.258 crypto_ram3 : 1.01 24103.35 94.15 0.00 0.00 5247.21 1560.38 5960.66
00:28:02.258 ===================================================================================================================
00:28:02.258 Total : 54351.08 212.31 0.00 0.00 9349.31 1560.38 48683.89
00:28:02.516 
00:28:02.516 real 0m3.869s
00:28:02.516 user 0m3.578s
00:28:02.516 sys 0m0.253s
00:28:02.516 19:04:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:28:02.516 19:04:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:28:02.516 ************************************
00:28:02.516 END TEST bdev_write_zeroes
00:28:02.516 ************************************
00:28:02.775 19:04:47 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:02.775 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:28:02.775 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:28:02.775 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:02.775 ************************************
00:28:02.775 START TEST bdev_json_nonenclosed
00:28:02.775 ************************************
00:28:02.775 19:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:02.775 [2024-07-24 19:04:47.628606] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization...
00:28:02.775 [2024-07-24 19:04:47.628645] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256878 ]
00:28:02.775 [2024-07-24 19:04:47.684280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:02.775 [2024-07-24 19:04:47.756424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:28:02.775 [2024-07-24 19:04:47.756499] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:28:02.775 [2024-07-24 19:04:47.756509] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:02.775 [2024-07-24 19:04:47.756515] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:03.033 00:28:03.033 real 0m0.237s 00:28:03.033 user 0m0.151s 00:28:03.033 sys 0m0.084s 00:28:03.033 19:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:03.033 19:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:03.033 ************************************ 00:28:03.033 END TEST bdev_json_nonenclosed 00:28:03.033 ************************************ 00:28:03.033 19:04:47 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:03.033 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:28:03.033 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:03.033 19:04:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:03.033 ************************************ 00:28:03.033 START TEST bdev_json_nonarray 00:28:03.033 ************************************ 00:28:03.033 19:04:47 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:03.033 [2024-07-24 19:04:47.940872] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:03.033 [2024-07-24 19:04:47.940905] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256901 ] 00:28:03.033 [2024-07-24 19:04:48.002021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:03.292 [2024-07-24 19:04:48.076024] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:03.292 [2024-07-24 19:04:48.076079] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
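Both JSON negative tests are behaving as intended: bdevperf rejects nonenclosed.json because the configuration is not wrapped in a top-level {} object, and nonarray.json because its "subsystems" member is not an array; the shutdown lines that follow show each run ending with spdk_app_stop'd on non-zero. The actual files are not reproduced in this log; minimal stand-ins with the same two defects would look roughly like this (illustrative only, the files shipped in the SPDK tree may differ):

  printf '"subsystems": []\n'      > /tmp/nonenclosed.json   # valid JSON value, but not enclosed in {}
  printf '{ "subsystems": {} }\n'  > /tmp/nonarray.json      # enclosed, but "subsystems" is not an array
  # feeding either file to bdevperf --json should fail in json_config_prepare_ctx as seen above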
00:28:03.292 [2024-07-24 19:04:48.076089] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:03.292 [2024-07-24 19:04:48.076095] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:03.292 00:28:03.292 real 0m0.258s 00:28:03.292 user 0m0.157s 00:28:03.292 sys 0m0.099s 00:28:03.292 19:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:03.292 19:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:03.292 ************************************ 00:28:03.292 END TEST bdev_json_nonarray 00:28:03.292 ************************************ 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:28:03.292 19:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:28:03.292 00:28:03.292 real 1m6.281s 00:28:03.292 user 2m39.431s 00:28:03.292 sys 0m5.962s 00:28:03.292 19:04:48 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:03.292 19:04:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:03.292 ************************************ 00:28:03.292 END TEST blockdev_crypto_qat 00:28:03.292 ************************************ 00:28:03.292 19:04:48 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:03.292 19:04:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:03.292 19:04:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:03.292 19:04:48 -- common/autotest_common.sh@10 -- # set +x 00:28:03.292 ************************************ 00:28:03.292 START TEST chaining 00:28:03.292 ************************************ 00:28:03.292 19:04:48 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:03.551 * Looking for test storage... 
00:28:03.551 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@7 -- # uname -s 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:03.551 19:04:48 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:03.551 19:04:48 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:03.551 19:04:48 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:03.551 19:04:48 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:03.551 19:04:48 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:03.551 19:04:48 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:03.551 19:04:48 chaining -- paths/export.sh@5 -- # 
export PATH 00:28:03.551 19:04:48 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@47 -- # : 0 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:28:03.551 19:04:48 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:03.551 19:04:48 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:03.551 19:04:48 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:03.551 19:04:48 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:03.551 19:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:10.112 
19:04:54 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:28:10.112 19:04:54 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:28:10.113 Found 0000:af:00.0 (0x8086 - 0x159b) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:28:10.113 Found 0000:af:00.1 (0x8086 - 0x159b) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:10.113 
19:04:54 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:28:10.113 Found net devices under 0000:af:00.0: cvl_0_0 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:28:10.113 Found net devices under 0000:af:00.1: cvl_0_1 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:28:10.113 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:10.113 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.162 ms 00:28:10.113 00:28:10.113 --- 10.0.0.2 ping statistics --- 00:28:10.113 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:10.113 rtt min/avg/max/mdev = 0.162/0.162/0.162/0.000 ms 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:28:10.113 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:28:10.113 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.212 ms 00:28:10.113 00:28:10.113 --- 10.0.0.1 ping statistics --- 00:28:10.113 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:10.113 rtt min/avg/max/mdev = 0.212/0.212/0.212/0.000 ms 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@422 -- # return 0 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:10.113 19:04:54 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@481 -- # nvmfpid=2260524 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@482 -- # waitforlisten 2260524 00:28:10.113 19:04:54 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@829 -- # '[' -z 2260524 ']' 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:10.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:10.113 19:04:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.113 [2024-07-24 19:04:54.883956] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
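The ping statistics above close out the network plumbing that nvmf/common.sh performs for this loopback NVMe-oF run: one port of the NIC (cvl_0_0) is moved into the cvl_0_0_ns_spdk namespace and becomes the target side at 10.0.0.2, while the other port (cvl_0_1) stays in the default namespace as the initiator side at 10.0.0.1. Condensed from the trace, the sequence is roughly:

  ip netns add cvl_0_0_ns_spdk
  ip link set cvl_0_0 netns cvl_0_0_ns_spdk
  ip addr add 10.0.0.1/24 dev cvl_0_1
  ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0
  ip link set cvl_0_1 up
  ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
  ip netns exec cvl_0_0_ns_spdk ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT
  ping -c 1 10.0.0.2                                  # initiator -> target
  ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1    # target -> initiator

The nvmf_tgt started next is itself launched through ip netns exec cvl_0_0_ns_spdk, which is why it later listens on 10.0.0.2 port 4420 while the initiator-side tools connect to it from the default namespace.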
00:28:10.113 [2024-07-24 19:04:54.883994] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:10.113 [2024-07-24 19:04:54.949698] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:10.113 [2024-07-24 19:04:55.025684] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:10.113 [2024-07-24 19:04:55.025723] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:28:10.113 [2024-07-24 19:04:55.025729] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:10.113 [2024-07-24 19:04:55.025735] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:10.113 [2024-07-24 19:04:55.025739] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:10.113 [2024-07-24 19:04:55.025779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:10.680 19:04:55 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:10.680 19:04:55 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:10.680 19:04:55 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:10.680 19:04:55 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:10.680 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.939 19:04:55 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.F39cik6TT7 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.5ftofDU3nH 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.939 malloc0 00:28:10.939 true 00:28:10.939 true 00:28:10.939 [2024-07-24 19:04:55.760834] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:10.939 crypto0 00:28:10.939 [2024-07-24 19:04:55.768862] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:10.939 crypto1 00:28:10.939 [2024-07-24 19:04:55.776947] tcp.c: 729:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:10.939 [2024-07-24 19:04:55.793110] tcp.c:1058:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@85 -- # update_stats 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:10.939 19:04:55 
chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:10.939 19:04:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:10.939 19:04:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:10.940 19:04:55 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:10.940 19:04:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:10.940 19:04:55 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.198 19:04:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.198 19:04:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.198 19:04:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.F39cik6TT7 bs=1K 
count=64 00:28:11.198 64+0 records in 00:28:11.198 64+0 records out 00:28:11.198 65536 bytes (66 kB, 64 KiB) copied, 0.000843567 s, 77.7 MB/s 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.F39cik6TT7 --ob Nvme0n1 --bs 65536 --count 1 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@25 -- # local config 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:11.198 19:04:55 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:11.198 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:11.198 19:04:56 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:11.198 "subsystems": [ 00:28:11.198 { 00:28:11.198 "subsystem": "bdev", 00:28:11.198 "config": [ 00:28:11.198 { 00:28:11.198 "method": "bdev_nvme_attach_controller", 00:28:11.198 "params": { 00:28:11.198 "trtype": "tcp", 00:28:11.198 "adrfam": "IPv4", 00:28:11.198 "name": "Nvme0", 00:28:11.198 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:11.198 "traddr": "10.0.0.2", 00:28:11.199 "trsvcid": "4420" 00:28:11.199 } 00:28:11.199 }, 00:28:11.199 { 00:28:11.199 "method": "bdev_set_options", 00:28:11.199 "params": { 00:28:11.199 "bdev_auto_examine": false 00:28:11.199 } 00:28:11.199 } 00:28:11.199 ] 00:28:11.199 } 00:28:11.199 ] 00:28:11.199 }' 00:28:11.199 19:04:56 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.F39cik6TT7 --ob Nvme0n1 --bs 65536 --count 1 00:28:11.199 19:04:56 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:11.199 "subsystems": [ 00:28:11.199 { 00:28:11.199 "subsystem": "bdev", 00:28:11.199 "config": [ 00:28:11.199 { 00:28:11.199 "method": "bdev_nvme_attach_controller", 00:28:11.199 "params": { 00:28:11.199 "trtype": "tcp", 00:28:11.199 "adrfam": "IPv4", 00:28:11.199 "name": "Nvme0", 00:28:11.199 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:11.199 "traddr": "10.0.0.2", 00:28:11.199 "trsvcid": "4420" 00:28:11.199 } 00:28:11.199 }, 00:28:11.199 { 00:28:11.199 "method": "bdev_set_options", 00:28:11.199 "params": { 00:28:11.199 "bdev_auto_examine": false 00:28:11.199 } 00:28:11.199 } 00:28:11.199 ] 00:28:11.199 } 00:28:11.199 ] 00:28:11.199 }' 00:28:11.199 [2024-07-24 19:04:56.070758] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
00:28:11.199 [2024-07-24 19:04:56.070797] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260795 ] 00:28:11.199 [2024-07-24 19:04:56.134041] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:11.199 [2024-07-24 19:04:56.206907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:11.715  Copying: 64/64 [kB] (average 15 MBps) 00:28:11.716 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.716 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:11.716 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
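The block above is the write half of the chaining data-path check: gen_nvme.sh emits a bdev configuration that attaches Nvme0 over NVMe-oF TCP to nqn.2016-06.io.spdk:cnode0 at 10.0.0.2:4420, jq appends a bdev_set_options entry that disables auto-examine, and spdk_dd receives that JSON on file descriptor 62 while copying the 64 KiB random file /tmp/tmp.F39cik6TT7 onto Nvme0n1; the read-back into /tmp/tmp.5ftofDU3nH and the cmp of the two files follow further below. A rough standalone sketch of the same round trip (process substitution stands in for the /dev/fd/62 redirection used by the script):

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  CONFIG=$("$SPDK_DIR"/scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
               --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 \
           | jq '.subsystems[0].config[.subsystems[0].config | length] |=
                 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')
  # write the payload to the exported namespace, then read it back and compare
  "$SPDK_DIR"/build/bin/spdk_dd -c <(echo "$CONFIG") --if /tmp/tmp.F39cik6TT7 --ob Nvme0n1 --bs 65536 --count 1
  "$SPDK_DIR"/build/bin/spdk_dd -c <(echo "$CONFIG") --of /tmp/tmp.5ftofDU3nH --ib Nvme0n1 --bs 65536 --count 1
  cmp /tmp/tmp.F39cik6TT7 /tmp/tmp.5ftofDU3nH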
00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@96 -- # update_stats 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:11.974 19:04:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.5ftofDU3nH --ib Nvme0n1 --bs 65536 --count 1 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@25 -- # local config 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:11.974 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:11.974 "subsystems": [ 00:28:11.974 { 00:28:11.974 "subsystem": "bdev", 00:28:11.974 "config": [ 00:28:11.974 { 00:28:11.974 "method": "bdev_nvme_attach_controller", 00:28:11.974 "params": { 00:28:11.974 "trtype": "tcp", 00:28:11.974 "adrfam": "IPv4", 00:28:11.974 "name": "Nvme0", 00:28:11.974 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:11.974 "traddr": "10.0.0.2", 00:28:11.974 "trsvcid": "4420" 00:28:11.974 } 00:28:11.974 }, 00:28:11.974 { 00:28:11.974 "method": "bdev_set_options", 00:28:11.974 "params": { 00:28:11.974 "bdev_auto_examine": false 00:28:11.974 } 00:28:11.974 } 00:28:11.974 ] 00:28:11.974 } 00:28:11.974 ] 00:28:11.974 }' 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.5ftofDU3nH --ib Nvme0n1 --bs 65536 --count 1 00:28:11.974 19:04:56 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:11.974 "subsystems": [ 00:28:11.974 { 00:28:11.974 "subsystem": "bdev", 00:28:11.974 "config": [ 00:28:11.974 { 00:28:11.974 "method": "bdev_nvme_attach_controller", 00:28:11.974 "params": { 00:28:11.974 "trtype": "tcp", 00:28:11.974 "adrfam": "IPv4", 00:28:11.974 "name": "Nvme0", 00:28:11.974 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:11.974 "traddr": "10.0.0.2", 00:28:11.974 "trsvcid": "4420" 00:28:11.974 } 00:28:11.974 }, 00:28:11.974 { 00:28:11.974 "method": "bdev_set_options", 00:28:11.974 "params": 
{ 00:28:11.974 "bdev_auto_examine": false 00:28:11.974 } 00:28:11.974 } 00:28:11.974 ] 00:28:11.974 } 00:28:11.974 ] 00:28:11.974 }' 00:28:12.233 [2024-07-24 19:04:57.027866] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:12.233 [2024-07-24 19:04:57.027911] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260947 ] 00:28:12.233 [2024-07-24 19:04:57.091346] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:12.233 [2024-07-24 19:04:57.162840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:12.750  Copying: 64/64 [kB] (average 10 MBps) 00:28:12.750 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:12.750 19:04:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.F39cik6TT7 /tmp/tmp.5ftofDU3nH 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@25 -- # local config 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:12.750 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:12.750 "subsystems": [ 00:28:12.750 { 00:28:12.750 "subsystem": "bdev", 00:28:12.750 "config": [ 00:28:12.750 { 00:28:12.750 "method": "bdev_nvme_attach_controller", 00:28:12.750 "params": { 00:28:12.750 "trtype": "tcp", 00:28:12.750 "adrfam": "IPv4", 00:28:12.750 "name": "Nvme0", 00:28:12.750 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:12.750 "traddr": "10.0.0.2", 00:28:12.750 "trsvcid": "4420" 00:28:12.750 } 00:28:12.750 }, 00:28:12.750 { 00:28:12.750 "method": "bdev_set_options", 00:28:12.750 "params": { 00:28:12.750 "bdev_auto_examine": false 00:28:12.750 } 00:28:12.750 } 00:28:12.750 ] 00:28:12.750 } 00:28:12.750 ] 00:28:12.750 }' 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:12.750 19:04:57 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:12.750 "subsystems": [ 00:28:12.750 { 00:28:12.750 "subsystem": "bdev", 00:28:12.750 "config": [ 00:28:12.750 { 00:28:12.750 "method": "bdev_nvme_attach_controller", 00:28:12.750 "params": { 00:28:12.750 "trtype": "tcp", 00:28:12.750 "adrfam": "IPv4", 00:28:12.750 "name": "Nvme0", 00:28:12.750 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:12.750 "traddr": "10.0.0.2", 00:28:12.750 "trsvcid": "4420" 00:28:12.750 } 00:28:12.750 }, 00:28:12.750 { 00:28:12.750 "method": "bdev_set_options", 00:28:12.750 "params": { 00:28:12.750 "bdev_auto_examine": false 00:28:12.750 } 00:28:12.750 } 00:28:12.750 ] 00:28:12.750 
} 00:28:12.750 ] 00:28:12.750 }' 00:28:13.009 [2024-07-24 19:04:57.801350] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:13.009 [2024-07-24 19:04:57.801393] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261090 ] 00:28:13.009 [2024-07-24 19:04:57.865709] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:13.009 [2024-07-24 19:04:57.938248] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:13.527  Copying: 64/64 [kB] (average 31 MBps) 00:28:13.527 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@106 -- # update_stats 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:13.527 19:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:13.527 19:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.527 19:04:58 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:28:13.786 19:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:13.786 19:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.786 19:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:13.786 19:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.F39cik6TT7 --ob Nvme0n1 --bs 4096 --count 16 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@25 -- # local config 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:13.786 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:13.786 19:04:58 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:13.786 "subsystems": [ 00:28:13.786 { 00:28:13.786 "subsystem": "bdev", 00:28:13.786 "config": [ 00:28:13.786 { 00:28:13.786 "method": "bdev_nvme_attach_controller", 00:28:13.786 "params": { 00:28:13.786 "trtype": "tcp", 00:28:13.786 "adrfam": "IPv4", 00:28:13.786 "name": "Nvme0", 00:28:13.786 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:13.786 "traddr": "10.0.0.2", 00:28:13.786 "trsvcid": "4420" 00:28:13.786 } 00:28:13.786 }, 00:28:13.786 { 00:28:13.786 "method": "bdev_set_options", 00:28:13.786 "params": { 00:28:13.786 "bdev_auto_examine": false 00:28:13.786 } 00:28:13.786 } 00:28:13.786 ] 00:28:13.786 } 00:28:13.786 ] 00:28:13.786 }' 00:28:13.787 19:04:58 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.F39cik6TT7 --ob Nvme0n1 --bs 4096 --count 16 00:28:13.787 19:04:58 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:13.787 "subsystems": [ 00:28:13.787 { 00:28:13.787 "subsystem": "bdev", 00:28:13.787 "config": [ 00:28:13.787 { 00:28:13.787 "method": "bdev_nvme_attach_controller", 00:28:13.787 "params": { 00:28:13.787 "trtype": "tcp", 00:28:13.787 "adrfam": "IPv4", 00:28:13.787 "name": "Nvme0", 00:28:13.787 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:13.787 "traddr": "10.0.0.2", 00:28:13.787 "trsvcid": "4420" 00:28:13.787 } 00:28:13.787 }, 00:28:13.787 { 00:28:13.787 "method": "bdev_set_options", 00:28:13.787 "params": { 00:28:13.787 "bdev_auto_examine": false 00:28:13.787 } 00:28:13.787 } 00:28:13.787 ] 00:28:13.787 } 00:28:13.787 ] 00:28:13.787 }' 00:28:13.787 [2024-07-24 19:04:58.691486] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 
initialization... 00:28:13.787 [2024-07-24 19:04:58.691525] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261333 ] 00:28:13.787 [2024-07-24 19:04:58.755838] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:14.045 [2024-07-24 19:04:58.828838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:14.562  Copying: 64/64 [kB] (average 10 MBps) 00:28:14.562 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:14.562 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@114 -- # update_stats 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.562 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:14.563 19:04:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:14.563 19:04:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:14.563 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.563 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.563 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:14.821 19:04:59 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.821 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:14.821 19:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:14.822 19:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:14.822 19:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:14.822 19:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@117 -- # : 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.5ftofDU3nH --ib Nvme0n1 --bs 4096 --count 16 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@25 -- # local config 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:14.822 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:14.822 "subsystems": [ 00:28:14.822 { 00:28:14.822 "subsystem": "bdev", 00:28:14.822 "config": [ 00:28:14.822 { 00:28:14.822 "method": "bdev_nvme_attach_controller", 00:28:14.822 "params": { 00:28:14.822 "trtype": "tcp", 00:28:14.822 "adrfam": "IPv4", 00:28:14.822 "name": "Nvme0", 00:28:14.822 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:14.822 "traddr": "10.0.0.2", 00:28:14.822 "trsvcid": "4420" 00:28:14.822 } 00:28:14.822 }, 00:28:14.822 { 00:28:14.822 "method": "bdev_set_options", 00:28:14.822 "params": { 00:28:14.822 "bdev_auto_examine": false 00:28:14.822 } 00:28:14.822 } 00:28:14.822 ] 00:28:14.822 } 00:28:14.822 ] 00:28:14.822 }' 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.5ftofDU3nH --ib Nvme0n1 --bs 4096 --count 16 00:28:14.822 19:04:59 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:14.822 "subsystems": [ 00:28:14.822 { 00:28:14.822 "subsystem": "bdev", 00:28:14.822 "config": [ 00:28:14.822 { 00:28:14.822 "method": "bdev_nvme_attach_controller", 00:28:14.822 "params": { 00:28:14.822 "trtype": "tcp", 00:28:14.822 "adrfam": "IPv4", 00:28:14.822 "name": "Nvme0", 00:28:14.822 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:14.822 "traddr": "10.0.0.2", 00:28:14.822 "trsvcid": "4420" 
00:28:14.822 } 00:28:14.822 }, 00:28:14.822 { 00:28:14.822 "method": "bdev_set_options", 00:28:14.822 "params": { 00:28:14.822 "bdev_auto_examine": false 00:28:14.822 } 00:28:14.822 } 00:28:14.822 ] 00:28:14.822 } 00:28:14.822 ] 00:28:14.822 }' 00:28:14.822 [2024-07-24 19:04:59.784295] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:14.822 [2024-07-24 19:04:59.784337] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261446 ] 00:28:15.081 [2024-07-24 19:04:59.849057] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:15.081 [2024-07-24 19:04:59.925060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.598  Copying: 64/64 [kB] (average 719 kBps) 00:28:15.598 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@43 -- 
# rpc_cmd accel_get_stats 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:15.598 19:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.F39cik6TT7 /tmp/tmp.5ftofDU3nH 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.F39cik6TT7 /tmp/tmp.5ftofDU3nH 00:28:15.598 19:05:00 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:28:15.598 19:05:00 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:15.598 19:05:00 chaining -- nvmf/common.sh@117 -- # sync 00:28:15.598 19:05:00 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:15.598 19:05:00 chaining -- nvmf/common.sh@120 -- # set +e 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:15.857 rmmod nvme_tcp 00:28:15.857 rmmod nvme_fabrics 00:28:15.857 rmmod nvme_keyring 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@124 -- # set -e 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@125 -- # return 0 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@489 -- # '[' -n 2260524 ']' 00:28:15.857 19:05:00 chaining -- nvmf/common.sh@490 -- # killprocess 2260524 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@948 -- # '[' -z 2260524 ']' 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@952 -- # kill -0 2260524 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@953 -- # uname 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2260524 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2260524' 00:28:15.857 killing process with pid 2260524 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@967 -- # kill 
2260524 00:28:15.857 19:05:00 chaining -- common/autotest_common.sh@972 -- # wait 2260524 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:16.114 19:05:00 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:16.114 19:05:00 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:16.114 19:05:00 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:18.014 19:05:02 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:28:18.014 19:05:02 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:18.014 19:05:02 chaining -- bdev/chaining.sh@132 -- # bperfpid=2262110 00:28:18.014 19:05:02 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2262110 00:28:18.014 19:05:02 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@829 -- # '[' -z 2262110 ']' 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:18.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:18.014 19:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:18.014 [2024-07-24 19:05:03.016344] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 
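The spdk_dd passes traced above all build their bdev configuration the same way: gen_nvme.sh emits a JSON config that attaches Nvme0 over NVMe/TCP, jq appends a bdev_set_options entry that disables bdev_auto_examine, and the result is handed to spdk_dd on file descriptor 62. A minimal sketch of that pattern, assuming relative paths inside an SPDK checkout (the jq expression, target address and dd flags are the ones visible in the trace; the here-string plumbing is illustrative only):

    # Generate a bdev subsystem config that attaches the remote namespace over TCP,
    # then append a bdev_set_options entry at the end of the config array.
    config=$(./scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
                 --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
             jq '.subsystems[0].config[.subsystems[0].config | length] |=
                 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

    # Feed the config to spdk_dd through fd 62 and copy one 64 KiB block from /dev/zero,
    # as the chaining test does for its first write pass.
    ./build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 \
        62<<< "$config"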
00:28:18.014 [2024-07-24 19:05:03.016395] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262110 ] 00:28:18.271 [2024-07-24 19:05:03.082656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:18.271 [2024-07-24 19:05:03.166557] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:18.833 19:05:03 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:18.833 19:05:03 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:18.833 19:05:03 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:28:18.833 19:05:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:18.833 19:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:19.090 malloc0 00:28:19.090 true 00:28:19.090 true 00:28:19.090 [2024-07-24 19:05:03.926449] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:19.090 crypto0 00:28:19.090 [2024-07-24 19:05:03.934476] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:19.090 crypto1 00:28:19.090 19:05:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:19.090 19:05:03 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:19.090 Running I/O for 5 seconds... 00:28:24.347 00:28:24.347 Latency(us) 00:28:24.347 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:24.347 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:24.347 Verification LBA range: start 0x0 length 0x2000 00:28:24.347 crypto1 : 5.01 17872.84 69.82 0.00 0.00 14288.14 3308.01 10797.84 00:28:24.347 =================================================================================================================== 00:28:24.347 Total : 17872.84 69.82 0.00 0.00 14288.14 3308.01 10797.84 00:28:24.347 0 00:28:24.347 19:05:09 chaining -- bdev/chaining.sh@146 -- # killprocess 2262110 00:28:24.347 19:05:09 chaining -- common/autotest_common.sh@948 -- # '[' -z 2262110 ']' 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@952 -- # kill -0 2262110 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@953 -- # uname 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2262110 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2262110' 00:28:24.348 killing process with pid 2262110 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@967 -- # kill 2262110 00:28:24.348 Received shutdown signal, test time was about 5.000000 seconds 00:28:24.348 00:28:24.348 Latency(us) 00:28:24.348 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:24.348 =================================================================================================================== 00:28:24.348 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@972 -- # wait 2262110 00:28:24.348 19:05:09 chaining -- bdev/chaining.sh@152 -- # 
bperfpid=2263025 00:28:24.348 19:05:09 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:24.348 19:05:09 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2263025 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@829 -- # '[' -z 2263025 ']' 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:24.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:24.348 19:05:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.348 [2024-07-24 19:05:09.337202] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:24.348 [2024-07-24 19:05:09.337250] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263025 ] 00:28:24.606 [2024-07-24 19:05:09.400605] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:24.606 [2024-07-24 19:05:09.479071] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:25.172 19:05:10 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:25.172 19:05:10 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:25.172 19:05:10 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:28:25.172 19:05:10 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.172 19:05:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.430 malloc0 00:28:25.430 true 00:28:25.430 true 00:28:25.430 [2024-07-24 19:05:10.259633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:28:25.430 [2024-07-24 19:05:10.259673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.430 [2024-07-24 19:05:10.259685] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21deee0 00:28:25.430 [2024-07-24 19:05:10.259691] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.430 [2024-07-24 19:05:10.260428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.430 [2024-07-24 19:05:10.260445] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:28:25.430 pt0 00:28:25.430 [2024-07-24 19:05:10.267662] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:25.430 crypto0 00:28:25.430 [2024-07-24 19:05:10.275681] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:25.430 crypto1 00:28:25.430 19:05:10 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.430 19:05:10 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:25.430 Running I/O for 5 seconds... 
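Both bdevperf passes in this part of the test follow the same shape: bdevperf is launched with --wait-for-rpc -z so it idles until configured, the malloc/passthru/crypto bdev stack is created over its RPC socket, and bdevperf.py perform_tests starts the 5-second verify workload. A rough outline of that flow, with the bdev-creation step left as a placeholder (binary and helper paths are the ones in the trace, shortened to relative paths):

    # Start bdevperf idle; -t 5 -w verify -o 4096 -q 256 match the runs above.
    ./build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperfpid=$!

    # ... create malloc0, the optional pt0 passthru bdev and the crypto0/crypto1
    #     chain over the RPC socket here (see the RPC notices in the trace) ...

    # Trigger the timed I/O run, then tear the process down.
    ./examples/bdev/bdevperf/bdevperf.py perform_tests
    kill "$bperfpid"
    wait "$bperfpid"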
00:28:30.800 00:28:30.800 Latency(us) 00:28:30.800 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.800 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:30.800 Verification LBA range: start 0x0 length 0x2000 00:28:30.800 crypto1 : 5.01 13913.22 54.35 0.00 0.00 18356.07 1341.93 13169.62 00:28:30.800 =================================================================================================================== 00:28:30.800 Total : 13913.22 54.35 0.00 0.00 18356.07 1341.93 13169.62 00:28:30.800 0 00:28:30.800 19:05:15 chaining -- bdev/chaining.sh@167 -- # killprocess 2263025 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@948 -- # '[' -z 2263025 ']' 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@952 -- # kill -0 2263025 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@953 -- # uname 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2263025 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2263025' 00:28:30.800 killing process with pid 2263025 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@967 -- # kill 2263025 00:28:30.800 Received shutdown signal, test time was about 5.000000 seconds 00:28:30.800 00:28:30.800 Latency(us) 00:28:30.800 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.800 =================================================================================================================== 00:28:30.800 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@972 -- # wait 2263025 00:28:30.800 19:05:15 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:28:30.800 19:05:15 chaining -- bdev/chaining.sh@170 -- # killprocess 2263025 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@948 -- # '[' -z 2263025 ']' 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@952 -- # kill -0 2263025 00:28:30.800 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2263025) - No such process 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2263025 is not found' 00:28:30.800 Process with pid 2263025 is not found 00:28:30.800 19:05:15 chaining -- bdev/chaining.sh@171 -- # wait 2263025 00:28:30.800 19:05:15 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:30.800 19:05:15 
chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:30.800 19:05:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:28:30.800 Found 0000:af:00.0 (0x8086 - 0x159b) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 
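The killprocess/wait sequence used to stop the bdevperf instances above is tolerant of a pid that has already exited: kill -0 probes for existence, ps resolves the process name before the real kill, and a missing process only produces the "No such process" / "is not found" messages seen in the trace. A simplified sketch of that helper (names and messages follow the trace; treat it as an outline, not the exact autotest_common.sh implementation):

    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || {            # nothing to do if it is already gone
            echo "Process with pid $pid is not found"
            return 0
        }
        ps --no-headers -o comm= "$pid"            # log which process is being stopped
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null                    # reap it so the trap/cleanup can continue
    }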
00:28:30.800 19:05:15 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:28:30.800 Found 0000:af:00.1 (0x8086 - 0x159b) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:28:30.800 Found net devices under 0000:af:00.0: cvl_0_0 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:28:30.800 Found net devices under 0000:af:00.1: cvl_0_1 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:28:30.800 19:05:15 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:28:30.801 19:05:15 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:28:31.060 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:31.060 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.205 ms 00:28:31.060 00:28:31.060 --- 10.0.0.2 ping statistics --- 00:28:31.060 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:31.060 rtt min/avg/max/mdev = 0.205/0.205/0.205/0.000 ms 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:28:31.060 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:28:31.060 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.196 ms 00:28:31.060 00:28:31.060 --- 10.0.0.1 ping statistics --- 00:28:31.060 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:31.060 rtt min/avg/max/mdev = 0.196/0.196/0.196/0.000 ms 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@422 -- # return 0 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:31.060 19:05:15 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@481 -- # nvmfpid=2264193 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@482 -- # waitforlisten 2264193 00:28:31.060 19:05:15 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@829 -- # '[' -z 2264193 ']' 00:28:31.060 19:05:15 chaining -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:31.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:31.060 19:05:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:31.060 [2024-07-24 19:05:15.992794] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:31.060 [2024-07-24 19:05:15.992840] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:31.060 [2024-07-24 19:05:16.060819] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:31.319 [2024-07-24 19:05:16.138964] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:31.319 [2024-07-24 19:05:16.138995] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:28:31.319 [2024-07-24 19:05:16.139002] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:31.319 [2024-07-24 19:05:16.139008] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:31.319 [2024-07-24 19:05:16.139016] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:31.319 [2024-07-24 19:05:16.139031] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:31.885 19:05:16 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:31.885 19:05:16 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:31.885 19:05:16 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:31.885 malloc0 00:28:31.885 [2024-07-24 19:05:16.827552] tcp.c: 729:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:31.885 [2024-07-24 19:05:16.843669] tcp.c:1058:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:31.885 19:05:16 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:28:31.885 19:05:16 chaining -- bdev/chaining.sh@189 -- # bperfpid=2264319 00:28:31.885 19:05:16 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2264319 /var/tmp/bperf.sock 00:28:31.885 19:05:16 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@829 
-- # '[' -z 2264319 ']' 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:31.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:31.885 19:05:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:32.144 [2024-07-24 19:05:16.905203] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:32.144 [2024-07-24 19:05:16.905244] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264319 ] 00:28:32.144 [2024-07-24 19:05:16.969537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:32.144 [2024-07-24 19:05:17.047507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:32.712 19:05:17 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:32.712 19:05:17 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:32.712 19:05:17 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:28:32.712 19:05:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:33.279 [2024-07-24 19:05:18.015194] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:33.279 nvme0n1 00:28:33.279 true 00:28:33.279 crypto0 00:28:33.279 19:05:18 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:33.279 Running I/O for 5 seconds... 
00:28:38.551 00:28:38.551 Latency(us) 00:28:38.551 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:38.551 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:38.551 Verification LBA range: start 0x0 length 0x2000 00:28:38.551 crypto0 : 5.01 12878.50 50.31 0.00 0.00 19827.93 2808.69 16852.11 00:28:38.551 =================================================================================================================== 00:28:38.551 Total : 12878.50 50.31 0.00 0.00 19827.93 2808.69 16852.11 00:28:38.551 0 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@205 -- # sequence=129156 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@206 -- # encrypt=64578 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:38.551 19:05:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@207 -- # decrypt=64578 
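Because this bdevperf instance was started with -r /var/tmp/bperf.sock, the statistics queries above go to that socket rather than the default /var/tmp/spdk.sock: rpc_bperf is simply rpc.py pointed at the bdevperf application. A compact version of the look-ups for the sequence, encrypt and decrypt counters, assuming relative paths (the trace issues a fresh accel_get_stats call per counter; this sketch takes a single snapshot):

    rpc_bperf() { ./scripts/rpc.py -s /var/tmp/bperf.sock "$@"; }

    stats=$(rpc_bperf accel_get_stats)
    sequence=$(jq -r '.sequence_executed' <<< "$stats")
    encrypt=$(jq -r '.operations[] | select(.opcode == "encrypt").executed' <<< "$stats")
    decrypt=$(jq -r '.operations[] | select(.opcode == "decrypt").executed' <<< "$stats")
    echo "sequence=$sequence encrypt=$encrypt decrypt=$decrypt"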
00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:38.811 19:05:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:39.069 19:05:23 chaining -- bdev/chaining.sh@208 -- # crc32c=129156 00:28:39.069 19:05:23 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:28:39.069 19:05:23 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:28:39.069 19:05:23 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:28:39.069 19:05:23 chaining -- bdev/chaining.sh@214 -- # killprocess 2264319 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@948 -- # '[' -z 2264319 ']' 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@952 -- # kill -0 2264319 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@953 -- # uname 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2264319 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2264319' 00:28:39.069 killing process with pid 2264319 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@967 -- # kill 2264319 00:28:39.069 Received shutdown signal, test time was about 5.000000 seconds 00:28:39.069 00:28:39.069 Latency(us) 00:28:39.069 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:39.069 =================================================================================================================== 00:28:39.069 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:39.069 19:05:23 chaining -- common/autotest_common.sh@972 -- # wait 2264319 00:28:39.328 19:05:24 chaining -- bdev/chaining.sh@219 -- # bperfpid=2265537 00:28:39.328 19:05:24 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:28:39.328 19:05:24 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2265537 /var/tmp/bperf.sock 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@829 -- # '[' -z 2265537 ']' 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:28:39.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:39.328 19:05:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.328 [2024-07-24 19:05:24.152023] Starting SPDK v24.09-pre git sha1 0bb5c21e2 / DPDK 24.03.0 initialization... 00:28:39.328 [2024-07-24 19:05:24.152069] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265537 ] 00:28:39.328 [2024-07-24 19:05:24.216830] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.328 [2024-07-24 19:05:24.295302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:40.263 19:05:24 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:40.263 19:05:24 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:40.263 19:05:24 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:28:40.263 19:05:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:40.263 [2024-07-24 19:05:25.246744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:40.263 nvme0n1 00:28:40.263 true 00:28:40.263 crypto0 00:28:40.521 19:05:25 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:40.521 Running I/O for 5 seconds... 00:28:45.793 00:28:45.793 Latency(us) 00:28:45.793 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:45.793 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:28:45.793 Verification LBA range: start 0x0 length 0x200 00:28:45.793 crypto0 : 5.00 2505.70 156.61 0.00 0.00 12527.48 616.35 13793.77 00:28:45.793 =================================================================================================================== 00:28:45.793 Total : 2505.70 156.61 0.00 0.00 12527.48 616.35 13793.77 00:28:45.793 0 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@233 -- # sequence=25082 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:45.793 
19:05:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@234 -- # encrypt=12541 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:45.793 19:05:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:45.794 19:05:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:45.794 19:05:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:45.794 19:05:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:45.794 19:05:30 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:45.794 19:05:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@235 -- # decrypt=12541 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:46.053 19:05:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@236 -- # crc32c=25082 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@242 -- # killprocess 2265537 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@948 -- # '[' -z 2265537 ']' 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@952 -- # kill -0 2265537 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@953 -- # uname 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2265537 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:46.312 
19:05:31 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2265537' 00:28:46.312 killing process with pid 2265537 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@967 -- # kill 2265537 00:28:46.312 Received shutdown signal, test time was about 5.000000 seconds 00:28:46.312 00:28:46.312 Latency(us) 00:28:46.312 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:46.312 =================================================================================================================== 00:28:46.312 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:46.312 19:05:31 chaining -- common/autotest_common.sh@972 -- # wait 2265537 00:28:46.312 19:05:31 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@117 -- # sync 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@120 -- # set +e 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:46.312 19:05:31 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:46.571 rmmod nvme_tcp 00:28:46.571 rmmod nvme_fabrics 00:28:46.571 rmmod nvme_keyring 00:28:46.571 19:05:31 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:46.571 19:05:31 chaining -- nvmf/common.sh@124 -- # set -e 00:28:46.571 19:05:31 chaining -- nvmf/common.sh@125 -- # return 0 00:28:46.571 19:05:31 chaining -- nvmf/common.sh@489 -- # '[' -n 2264193 ']' 00:28:46.571 19:05:31 chaining -- nvmf/common.sh@490 -- # killprocess 2264193 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@948 -- # '[' -z 2264193 ']' 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@952 -- # kill -0 2264193 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@953 -- # uname 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2264193 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2264193' 00:28:46.571 killing process with pid 2264193 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@967 -- # kill 2264193 00:28:46.571 19:05:31 chaining -- common/autotest_common.sh@972 -- # wait 2264193 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:46.830 19:05:31 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:46.830 19:05:31 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:46.830 19:05:31 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:48.734 19:05:33 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:28:48.734 19:05:33 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM 
EXIT 00:28:48.734 00:28:48.734 real 0m45.397s 00:28:48.734 user 0m55.184s 00:28:48.734 sys 0m10.015s 00:28:48.734 19:05:33 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:48.734 19:05:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:48.734 ************************************ 00:28:48.734 END TEST chaining 00:28:48.734 ************************************ 00:28:48.734 19:05:33 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:28:48.734 19:05:33 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:28:48.734 19:05:33 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:28:48.734 19:05:33 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:28:48.734 19:05:33 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:28:48.734 19:05:33 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:28:48.734 19:05:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:48.734 19:05:33 -- common/autotest_common.sh@10 -- # set +x 00:28:48.734 19:05:33 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:28:48.734 19:05:33 -- common/autotest_common.sh@1390 -- # local autotest_es=0 00:28:48.734 19:05:33 -- common/autotest_common.sh@1391 -- # xtrace_disable 00:28:48.734 19:05:33 -- common/autotest_common.sh@10 -- # set +x 00:28:52.942 INFO: APP EXITING 00:28:52.942 INFO: killing all VMs 00:28:52.942 INFO: killing vhost app 00:28:52.942 INFO: EXIT DONE 00:28:55.473 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:28:55.473 Waiting for block devices as requested 00:28:55.473 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:28:55.732 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:28:55.732 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:28:55.990 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:28:55.990 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:28:55.990 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:28:55.990 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:28:56.249 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:28:56.249 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:28:56.249 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:28:56.249 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:28:56.507 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:28:56.507 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:28:56.507 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:28:56.766 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:28:56.766 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:28:56.766 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:28:59.398 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:28:59.656 Cleaning 00:28:59.656 Removing: /var/run/dpdk/spdk0/config 00:28:59.656 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:28:59.656 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:28:59.915 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:28:59.915 Removing: /var/run/dpdk/spdk0/hugepage_info 00:28:59.915 Removing: /dev/shm/nvmf_trace.0 00:28:59.915 Removing: /dev/shm/spdk_tgt_trace.pid2002571 00:28:59.915 Removing: /var/run/dpdk/spdk0 00:28:59.915 Removing: /var/run/dpdk/spdk_pid1998855 00:28:59.915 Removing: 
/var/run/dpdk/spdk_pid2001350 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2002571 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2003192 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2004518 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2004754 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2005722 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2005840 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2006064 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2008727 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2010433 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2010717 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2011004 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2011413 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2011785 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2011992 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2012193 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2012471 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2013297 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2016260 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2016512 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2016801 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2017065 00:28:59.915 Removing: /var/run/dpdk/spdk_pid2017092 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2017217 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2017525 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2017804 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2018091 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2018356 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2018598 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2018851 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2019097 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2019347 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2019593 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2019837 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2020085 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2020332 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2020580 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2020825 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2021073 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2021326 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2021571 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2021827 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2022069 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2022316 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2022642 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2023025 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2023272 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2023526 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2023961 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2024239 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2024496 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2024784 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2025022 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2025331 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2025782 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2026156 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2026298 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2029993 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2032097 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2034146 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2035288 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2036570 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2036841 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2037041 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2037079 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2042120 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2042822 00:28:59.916 Removing: 
/var/run/dpdk/spdk_pid2043967 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2044219 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2049533 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2051122 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2052106 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2056105 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2057712 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2058687 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2062698 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2065069 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2065859 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2075650 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2077768 00:28:59.916 Removing: /var/run/dpdk/spdk_pid2078785 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2088265 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2090166 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2091181 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2100525 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2103711 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2104710 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2115517 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2117896 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2119043 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2129458 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2131840 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2132854 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2143830 00:29:00.174 Removing: /var/run/dpdk/spdk_pid2147513 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2148634 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2149647 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2152740 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2157850 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2160440 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2165151 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2168625 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2174181 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2177324 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2183677 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2185860 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2192007 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2194149 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2200299 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2202658 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2206994 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2207574 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2207944 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2208319 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2209313 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2210030 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2210786 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2211321 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2213162 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2215001 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2216838 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2218279 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2220121 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2221956 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2223796 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2225236 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2225863 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2226407 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2228487 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2230667 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2233043 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2234205 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2235574 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2236267 00:29:00.175 Removing: 
/var/run/dpdk/spdk_pid2236288 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2236556 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2236817 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2236848 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2237950 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2240401 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2242333 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2243252 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2244175 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2244457 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2244658 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2244679 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2245687 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2246356 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2246828 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2249025 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2251286 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2253536 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2254820 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2256187 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2256878 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2256901 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2260795 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2260947 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2261090 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2261333 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2261446 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2262110 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2263025 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2264319 00:29:00.175 Removing: /var/run/dpdk/spdk_pid2265537 00:29:00.175 Clean 00:29:00.434 19:05:45 -- common/autotest_common.sh@1449 -- # return 0 00:29:00.434 19:05:45 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:29:00.434 19:05:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:00.434 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:29:00.434 19:05:45 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:29:00.434 19:05:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:00.434 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:29:00.434 19:05:45 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:29:00.434 19:05:45 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:29:00.434 19:05:45 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:29:00.434 19:05:45 -- spdk/autotest.sh@391 -- # hash lcov 00:29:00.434 19:05:45 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:29:00.434 19:05:45 -- spdk/autotest.sh@393 -- # hostname 00:29:00.434 19:05:45 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-03 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:29:00.692 geninfo: WARNING: invalid characters removed from testname! 
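The coverage assembly that starts here is plain lcov plumbing: a per-test capture (the geninfo/lcov -c step above), a merge with the pre-test baseline, then filters that strip DPDK, system headers and example apps from the combined report, exactly as the invocations that follow show. A condensed sketch, with the long --rc flag list abbreviated and the output directory taken from the logged paths:

OUT=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
LCOV="lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q"

# Merge the pre-test baseline with the capture taken after the tests ran...
$LCOV -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"

# ...then drop vendored and uninteresting code from the combined report.
for pattern in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        $LCOV -r "$OUT/cov_total.info" "$pattern" -o "$OUT/cov_total.info"
done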
00:29:22.621 19:06:03 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:22.621 19:06:06 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:23.188 19:06:08 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:25.091 19:06:09 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:26.988 19:06:11 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:28.364 19:06:13 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:30.266 19:06:14 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:29:30.266 19:06:14 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:30.266 19:06:14 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:29:30.266 19:06:14 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:30.266 19:06:14 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:30.266 19:06:14 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 19:06:14 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 19:06:14 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 19:06:14 -- paths/export.sh@5 -- $ export PATH 00:29:30.266 19:06:14 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 19:06:14 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:30.266 19:06:14 -- common/autobuild_common.sh@447 -- $ date +%s 00:29:30.266 19:06:14 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721840774.XXXXXX 00:29:30.266 19:06:14 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721840774.TwovO3 00:29:30.266 19:06:14 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:29:30.266 19:06:14 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']' 00:29:30.266 19:06:14 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:29:30.266 19:06:14 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:29:30.266 19:06:14 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:29:30.266 19:06:14 -- common/autobuild_common.sh@463 -- $ get_config_params 00:29:30.266 19:06:14 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:29:30.266 19:06:14 -- common/autotest_common.sh@10 -- $ set +x 00:29:30.266 19:06:14 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:29:30.266 19:06:14 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:29:30.266 19:06:14 -- pm/common@17 -- $ local monitor 00:29:30.266 19:06:14 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:30.266 19:06:14 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:30.266 19:06:14 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:30.266 19:06:14 -- pm/common@21 -- $ date +%s 00:29:30.266 19:06:14 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:30.266 19:06:14 -- 
pm/common@21 -- $ date +%s 00:29:30.266 19:06:14 -- pm/common@21 -- $ date +%s 00:29:30.266 19:06:14 -- pm/common@25 -- $ sleep 1 00:29:30.266 19:06:14 -- pm/common@21 -- $ date +%s 00:29:30.266 19:06:14 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721840774 00:29:30.266 19:06:14 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721840774 00:29:30.266 19:06:14 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721840774 00:29:30.266 19:06:14 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721840774 00:29:30.266 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721840774_collect-vmstat.pm.log 00:29:30.266 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721840774_collect-cpu-temp.pm.log 00:29:30.266 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721840774_collect-cpu-load.pm.log 00:29:30.266 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721840774_collect-bmc-pm.bmc.pm.log 00:29:31.204 19:06:15 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:29:31.204 19:06:15 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j96 00:29:31.204 19:06:15 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.204 19:06:15 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:29:31.204 19:06:15 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:29:31.204 19:06:15 -- spdk/autopackage.sh@19 -- $ timing_finish 00:29:31.204 19:06:15 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:29:31.204 19:06:15 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:29:31.204 19:06:15 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:29:31.204 19:06:16 -- spdk/autopackage.sh@20 -- $ exit 0 00:29:31.204 19:06:16 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:29:31.204 19:06:16 -- pm/common@29 -- $ signal_monitor_resources TERM 00:29:31.204 19:06:16 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:29:31.204 19:06:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:31.204 19:06:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:29:31.204 19:06:16 -- pm/common@44 -- $ pid=2276924 00:29:31.204 19:06:16 -- pm/common@50 -- $ kill -TERM 2276924 00:29:31.204 19:06:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:31.204 19:06:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:29:31.204 19:06:16 -- pm/common@44 -- $ pid=2276926 00:29:31.204 19:06:16 -- pm/common@50 -- 
$ kill -TERM 2276926 00:29:31.204 19:06:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:31.204 19:06:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:29:31.204 19:06:16 -- pm/common@44 -- $ pid=2276928 00:29:31.204 19:06:16 -- pm/common@50 -- $ kill -TERM 2276928 00:29:31.204 19:06:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:31.204 19:06:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:29:31.204 19:06:16 -- pm/common@44 -- $ pid=2276954 00:29:31.204 19:06:16 -- pm/common@50 -- $ sudo -E kill -TERM 2276954 00:29:31.204 + [[ -n 1882973 ]] 00:29:31.204 + sudo kill 1882973 00:29:31.214 [Pipeline] } 00:29:31.231 [Pipeline] // stage 00:29:31.236 [Pipeline] } 00:29:31.253 [Pipeline] // timeout 00:29:31.261 [Pipeline] } 00:29:31.277 [Pipeline] // catchError 00:29:31.282 [Pipeline] } 00:29:31.299 [Pipeline] // wrap 00:29:31.305 [Pipeline] } 00:29:31.321 [Pipeline] // catchError 00:29:31.329 [Pipeline] stage 00:29:31.331 [Pipeline] { (Epilogue) 00:29:31.345 [Pipeline] catchError 00:29:31.346 [Pipeline] { 00:29:31.359 [Pipeline] echo 00:29:31.361 Cleanup processes 00:29:31.366 [Pipeline] sh 00:29:31.642 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.642 2277060 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:29:31.643 2277320 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.656 [Pipeline] sh 00:29:31.999 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.999 ++ grep -v 'sudo pgrep' 00:29:31.999 ++ awk '{print $1}' 00:29:31.999 + sudo kill -9 2277060 00:29:32.009 [Pipeline] sh 00:29:32.287 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:29:40.393 [Pipeline] sh 00:29:40.667 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:29:40.667 Artifacts sizes are good 00:29:40.681 [Pipeline] archiveArtifacts 00:29:40.688 Archiving artifacts 00:29:40.820 [Pipeline] sh 00:29:41.099 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:29:41.113 [Pipeline] cleanWs 00:29:41.123 [WS-CLEANUP] Deleting project workspace... 00:29:41.123 [WS-CLEANUP] Deferred wipeout is used... 00:29:41.128 [WS-CLEANUP] done 00:29:41.130 [Pipeline] } 00:29:41.150 [Pipeline] // catchError 00:29:41.162 [Pipeline] sh 00:29:41.486 + logger -p user.info -t JENKINS-CI 00:29:41.495 [Pipeline] } 00:29:41.512 [Pipeline] // stage 00:29:41.518 [Pipeline] } 00:29:41.536 [Pipeline] // node 00:29:41.542 [Pipeline] End of Pipeline 00:29:41.575 Finished: SUCCESS
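For reference, the stop_monitor_resources trace in the epilogue above boils down to a pid-file sweep: each collector started by autopackage recorded its pid under the power output directory, and teardown signals whatever is still recorded there. A simplified sketch (directory and monitor names are the ones from the log; the bmc collector actually needs sudo, and error handling is omitted):

POWER_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power

for monitor in collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm; do
        pidfile="$POWER_DIR/${monitor}.pid"
        [ -e "$pidfile" ] || continue
        # Assumption: the pid written at start-up is read back from the file.
        kill -TERM "$(cat "$pidfile")"
done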